My company wants to host its own sites. I need advice

So I work as an IT manager/web developer for a company that owns about 30 radio stations. The owner came to me and wants to look at hosting our own websites in one of his buildings. Can anyone point me to some good resources for researching what I would need to turn my company into its own data center? The owner has already agreed to a dedicated T1 line, or whatever is available in our area, but I am not sure what specs I need or how many servers. Some of our current sites are pretty high traffic, and they also hope to set up a hub-and-spoke VPN. We have plenty of equipment racks and air-conditioned spaces in the building; I am just not sure what equipment to put on my wish list to make this work.

So I am looking at hosting about 30 websites with room for expansion, but I am not sure where to start. I have fairly extensive knowledge of LAMP and of securing my testing environments, but I am not sure what kind of resources it takes to run our own data center.

scylex,

I believe that your company is overlooking a lot of aspects like:

  • Redundant power
  • Redundant servers (separate locations)
  • Redundant communications lines
  • 7/24/365 monitoring
  • Extensive firewalls, i.e., keep the web servers out of the company’s internal network!
  • Extensive software (daemons to run a web server)

This will require massive expenditure on the part of the company and is NOT a reasonable thing to do. Better to get a managed hosting service to transfer the headaches to a professional TEAM. This is no reflection upon you, just that a production server requires constant monitoring (and specialized tools).

Regards,

DK

Yeah, I get what you're saying. However, we are prepared to hire people and buy equipment. The owners of this company are billionaires who run oil companies; the radio business was started as a retirement project for them. They have decided they want all their sites hosted internally, and they want me to come up with an action plan detailing what we need to buy, what kind of people we will need, what our expansion plan will be, etc. At the same time, they don't want to waste money; they would like to get only what we need.

Almost everything I find on the Internet boils down to "let the professionals do it." Well, I have run many unmanaged dedicated servers for different clients over the years, so I know what is involved on the operations side of a server. I have over 10 years of experience as a web developer and 8 years as an Electronics Technician in the U.S. Navy, where I managed the backbone servers for ships and worked on communication equipment for a long time. I know about redundancy and cooling.

We run 30 radio stations. The amount of equipment involved in running a radio station is beyond most people's comprehension. When the owners set out to build these stations, they did not buy a dedicated radio station host to broadcast for them, and they do not want to buy a managed dedicated host to run their websites. So, I respect your input that a managed host may be a cheaper and easier option, but it is not our company's way.

My goal is to find resources and information for what is required to build our own data center. Things such as hardware recommendations and assistance in determining the specs of the servers we would need. How many servers would we need to handle the load of 30 to 50 websites?

We are not planning to run a data center for other people, and if the equipment goes offline we don't have to freak out to get it back up. If the system went down overnight, we would bring it up in the morning. Our sites provide a service to our community, but they are not so critical that the system needs 24/7 monitoring. That said, we would like them to be as stable as possible and to have 99% uptime; some downtime is not going to break us. Believe me, our current hosting is not that stable, but the CMS that runs the current system is provided by the host as part of our package. I am moving all our sites to a new CMS called PyroCMS, which will free us to move our hosting, and what the owners want is to move to our own hosting system. So I am asking if anyone has information they can link me to that would help me research and determine our actual hardware and personnel requirements.

If you are going to leave the site down for many hours just because it crashes after hours at your location (which will be right in the middle of the day for a large part of the world), then you might as well not bother with it in the first place. Most people will give up completely on a site if it is ever down for more than 10 or 15 minutes. You will never get remotely close to 99% uptime if you leave the site down for more than a few minutes each time it has problems.

The simplest way to have complete control over the hardware and software the server is running, without needing to worry about support issues, is to use co-location. That's where you provide all the hardware and software and have it hosted in an existing datacentre. If nothing else, that will save you the huge cost of the fibre-optic connections required to get your server connected to the internet at appropriate speeds. It will probably also save you the cost of buying your own UPS (which would likely cost at least as much as the rest of the hardware if it is to keep things running for more than a few minutes when the power goes off).

OK, we already have generators and UPSes everywhere. We run radio stations! We keep our stations up 99% of the time, even in bad weather. Our local community sites are not designed to reach the other side of the globe; they are designed to serve our community. If you are loading one of our sites, like eldoark.com, from the other side of the world, you are doing it wrong. We plan to build a system that is rock solid and has what's needed to keep it up around 99% of the time, but some downtime is not the end of the world for us.

We are in El Dorado, Arkansas. There aren't really data centers around here; the closest one would be two hours away. Like a local radio station, we work at the local community level. I really like SitePoint and thought we might at least get some direction from you guys. I know you are trying to be helpful, but "don't do a company data center" is exactly the advice you find all over the internet. If I wanted to be told to co-locate, I would not have posted the question; I can get that answer anywhere. What I need help with is determining the scale of equipment we need to get. I will probably just call the sales guys at CDW, though I'm sure they will do their best to oversell me. It would be nice to find some material to read that would help me determine what kind of systems we really need to do what we want. And if someday we find that our community site has grown into a global site that is too taxing for our daytime staff, we can hire a night staff.

scylex,

Too much time and money on their hands, eh? I’m not used to that - a completely unfamiliar world for me!

That said, I do not have experience (nor expertise) in building a data center (Stephen's idea is good but likely to be rejected, too, by your owners). Neither do I have the background to estimate whether a T1 would be sufficient under a major load (you've got to size for the highest bandwidth anticipated, roughly stream bitrate x simultaneous visitors plus page weight x page views) - but you'd have a good handle on that.
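Just to make that sizing arithmetic concrete, here is a rough back-of-envelope sketch - every number in it is a made-up assumption, so plug in your own listener counts, bitrates and page weights:

```python
# Back-of-envelope bandwidth check: can one T1 carry the anticipated peak?
# Every figure below is a placeholder assumption -- substitute your own numbers.

T1_KBPS = 1544                  # a T1 is roughly 1.544 Mbps

stream_bitrate_kbps = 128       # assumed bitrate of a single audio stream
peak_listeners = 200            # assumed simultaneous streaming listeners
page_weight_kb = 500            # assumed average page weight in kilobytes
peak_pageviews_per_sec = 5      # assumed peak page requests per second

streaming_kbps = stream_bitrate_kbps * peak_listeners
web_kbps = page_weight_kb * 8 * peak_pageviews_per_sec   # kilobytes -> kilobits

total_kbps = streaming_kbps + web_kbps
print(f"Peak demand: {total_kbps / 1000:.1f} Mbps vs T1 capacity {T1_KBPS / 1000:.3f} Mbps")
print("T1 is enough" if total_kbps < T1_KBPS else "T1 is nowhere near enough")
```

Even a couple of hundred concurrent 128 kbps streams blows well past a single T1, which is why the bandwidth question has to come first.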

What I would do in your place would be to visit a lot of hosting sites to see what services they offer and ASK what tools they use to provide those services - stressing that you only want to know the advantages of their hardware/software.

I see adverts all the time for “server” boxes (XEON processors but in a PC case) but I believe that would be insufficient for you (rack mounted servers should be easier to maintain).

The daemons required can easily be found in any WHM account.

My stumbling around like this is due to my lack of knowledge about any publications which would directly answer your question. Should you find such, please post!

Regards,

DK

Your primary constraint on feasibility, before you even look at hardware, software and overall setup, is connectivity, in particular bandwidth. If you are streaming to many thousands of listeners simultaneously then this is going to be substantial, and may be beyond what the local infrastructure can support in a remote area. You'd want to be sure about how the bandwidth is billed, e.g. 95th percentile or average, because if streaming is your primary consumption then the bandwidth saturation will be high. You'd also want more than one network provider for resilience - one clumsy roadworks dig could mean going offline for days if you are only hooked up to one. Find out whether you can get an economically viable connection before going any further. Chances are you'd be paying the telco per mile for fibre laying and for bandwidth in the hundreds of megabits, neither of which are services that normally belong in the same economic bracket as community radio stations.
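In case the 95th-percentile model is unfamiliar, here is a small sketch of how it is usually calculated - the traffic samples are invented purely for illustration: the provider samples your utilisation every five minutes or so, throws away the top 5% of samples for the month, and bills you at the highest sample that remains.

```python
# Sketch of 95th-percentile billing: sample utilisation every 5 minutes for a month,
# discard the top 5% of samples, and bill at the highest sample that remains.
# The traffic samples here are invented for illustration only.
import random

random.seed(1)
samples_mbps = [random.uniform(5, 40) for _ in range(8640)]   # ~30 days of 5-minute samples

samples_mbps.sort()
cutoff = int(len(samples_mbps) * 0.95)       # everything from here up is discarded
billable_mbps = samples_mbps[cutoff - 1]     # highest sample that survives

print(f"Average usage:            {sum(samples_mbps) / len(samples_mbps):.1f} Mbps")
print(f"95th percentile (billed): {billable_mbps:.1f} Mbps")
```

With round-the-clock streaming the utilisation curve is flat, so the 95th-percentile figure lands close to your peak rather than your average - which is exactly why the billing model matters here.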

The reason you find the advice "don't build a data centre" all over the internet is because it's right. It's a similar endeavour to building your own private railways or roads when you could use the railways and roads that are already built. Nonetheless, you might want to join the professional section at webhostingtalk; there are plenty of data centre staff and large hosting company owners there who have the knowledge of the equipment you require, and probably staff and consultants you can recruit.

If you want to have your own datacentre, you might want to grab a box offsite somewhere for disaster recovery and sync everything to it. If you have some sort of connectivity issue, or another local disaster, you'll then have a fallback - you could run the sites from there till you're online again.

There’s quite a lot to look at, but setting up a small datacentre should be fine. The basic points are:

  • cooling
  • raised floor, or ceiling conduits
  • rack mounting
  • surveillance
  • UPS and power conditioning
  • 2-3 independent network connections
  • routing and failover for network connections
  • procedure for DDOS and network problems
  • some sort of monitoring - pingdom might be enough (a minimal DIY sketch follows this list)
  • backup
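On the monitoring point, a DIY check really can be this small - a rough sketch, assuming plain HTTP sites; the second URL and the interval are placeholders:

```python
# Minimal uptime check in the "pingdom might be enough" spirit: poll each site
# and raise the alarm (here, just a print) when one stops answering.
# The second URL and the interval are placeholders.
import time
import urllib.request

SITES = ["http://eldoark.com", "http://example-station.com"]
CHECK_INTERVAL = 300   # seconds between passes

def is_up(url):
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False

while True:
    for url in SITES:
        if not is_up(url):
            print(f"ALERT: {url} is not responding")   # swap in an email/SMS hook here
    time.sleep(CHECK_INTERVAL)
```

Swap the print for an email or SMS alert and run it from a machine that isn't sitting in the rack it's watching.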

The comments about data bandwidth above are probably the most important - make sure you have at least 2 independent network connections. You’ll also want to look at bandwidth usage if you are streaming, not too much of a problem from most locations in the US I’m sure.

You could easily run 30 sites on one larger LAMP server, and this should save costs - although that of course depends on bandwidth, which I'm assuming is low for community radio station sites. If you only have one production server, I'd have two others - one for fallback and one for development.
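To give an idea of what "30 sites on one LAMP box" looks like in practice, here is a sketch that spits out name-based Apache virtual hosts from a single list of domains - the domains and paths are placeholders, and it assumes a Debian/Ubuntu-style Apache layout:

```python
# Sketch: generate name-based Apache virtual hosts for every site from one list,
# so a single LAMP box answers for all 30 domains. Domains and paths are
# placeholders, and the layout assumes a Debian/Ubuntu-style Apache install.
DOMAINS = ["eldoark.com", "station2.example", "station3.example"]   # extend to all 30

VHOST_TEMPLATE = """<VirtualHost *:80>
    ServerName {domain}
    ServerAlias www.{domain}
    DocumentRoot /var/www/{domain}/public
    ErrorLog ${{APACHE_LOG_DIR}}/{domain}-error.log
    CustomLog ${{APACHE_LOG_DIR}}/{domain}-access.log combined
</VirtualHost>

"""

with open("/etc/apache2/sites-available/stations.conf", "w") as conf:
    for domain in DOMAINS:
        conf.write(VHOST_TEMPLATE.format(domain=domain))
```

After writing the file you'd enable it and reload Apache (on Debian-style systems, a2ensite followed by a graceful reload).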

I'd run with mod_security, CSF and general server hardening to protect against hacking and other security issues. I'd also want to have at least some of my backups offline to protect against data-wiping hacks, which are common enough these days. If you use nginx you'll get a lot better performance out of the server than with Apache.
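On the offline backup point, a minimal nightly sketch might look like this - database names and paths are placeholders, and the key design choice is that a separate box pulls the dumps rather than the web server pushing them, so a compromised server can't reach out and wipe the copies:

```python
# Nightly backup sketch: dump each site's database and archive the web roots.
# Database names and paths are placeholders. For the "offline" copy, have a
# separate box PULL these files (e.g. rsync over SSH) so a compromised web
# server has no credentials it could use to reach out and wipe them.
import datetime
import subprocess

DATABASES = ["station1_db", "station2_db"]        # placeholder database names
BACKUP_DIR = "/var/backups/sites"
stamp = datetime.date.today().isoformat()

for db in DATABASES:
    dump_file = f"{BACKUP_DIR}/{db}-{stamp}.sql.gz"
    # mysqldump piped through gzip; credentials are read from ~/.my.cnf
    subprocess.run(f"mysqldump {db} | gzip > {dump_file}", shell=True, check=True)

# Archive the document roots as well
subprocess.run(["tar", "czf", f"{BACKUP_DIR}/webroots-{stamp}.tar.gz", "/var/www"], check=True)
```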

Why not take a cage from someone like Equinix, Coresite, or DFT? They will maintain the facility, which also cuts the time spent building and maintaining your own, and you are still independent there.

I totally agree.

Scylex

This is time-consuming and a big headache. Why waste time, money and energy on this? There are plenty of hosting companies that can handle that much web space. Why does your boss want to break his head over unnecessary things?

Regards

San,

The answer (post #3) was that the owners have more dollars than sense. :nono:

Since it’s their choice, scylex’s job was to advise against their stupidity ONCE then follow their direction.

Regards,

DK

Agreed

But… some of this can be solved via a 3rd-party CDN security and cloud service in front of your server.

Such a service, which may even include a PCI-compliant WAF, can cost less than $100 and will provide redundant power, servers and communication lines, plus full 24/7/365 security monitoring and firewalls managed by an outside team at no extra cost.

You will still need to monitor your own servers, and the setup itself is costly, but in the end this is definitely doable.