It seems as if most of my colleagues are using Git in a number of areas for version control, team development, and overall organization. I'm taking it upon myself to see whether Git really has a place in my day-to-day workflow and whether it makes sense to adopt.
We develop a lot of small websites. Development teams are really small (i.e. 1-3 people per project).
Right now our current (non-vcs) system is like this:
1) Log in to the development server via FTP and grab the files you need to work on (edit locally).
2) Make the changes and push them back to the dev server.
3) Test in the browser.
4) If all goes well, push to the main live web server.
The problems with this setup:
1) Backups are always manual, or depend on a weekly backup script running.
2) It's hard to keep track of who's working on what without them "checking in with you".
3) If a problem arises down the road related to an earlier fix, we are limited to restoring an entire old backup (if one exists) or manually repairing the bug.
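For what it's worth, problem 3 is exactly where a VCS shines: with Git, a single bad change can be reverted on its own, without restoring a full backup. A minimal sketch using a throwaway repo (the file name and commit messages are invented for illustration):

```shell
# Sketch: undo one bad change with Git instead of restoring a whole backup.
set -e
tmp=$(mktemp -d) && cd "$tmp"
# Throwaway identity so commits work in a clean environment.
g() { git -c user.email=dev@example.com -c user.name=Dev "$@"; }

g init -q
echo "working code" > page.php
g add . && g commit -qm "Initial site"

echo "buggy fix" > page.php
g commit -qam "Apply fix"       # this turns out to be the bad change

# Revert just that one commit; everything else stays intact.
g revert --no-edit HEAD
```

After the revert, `page.php` is back to its pre-fix state, and the history still records both the fix and its reversal.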
We use WordPress a lot. Is it worthwhile to use Git to track these types of builds?
In addition, if a repo is private, is it safe to include wp-config.php so it's an easy push to the master (live) branch?
If we use Git locally, do we have to use GitHub to serve the projects, or can we do all of this without GitHub?
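On the GitHub question: Git is fully distributed, so a bare repository on any server you can reach (e.g. over SSH) works as the shared remote; GitHub is optional. A rough sketch using local paths in place of a real server (all paths are illustrative):

```shell
# Sketch: self-hosted Git workflow with no GitHub involved.
set -e
tmp=$(mktemp -d)

# "Server" side: a bare repo -- no working copy, just history.
# In real life this might live at user@devserver:/srv/git/site.git
git init -q --bare "$tmp/site.git"

# Developer side: clone, commit, push.
git clone -q "$tmp/site.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
echo "<?php // home page" > index.php
git add index.php
git -c user.email=dev@example.com -c user.name=Dev commit -qm "First commit"
git push -q origin HEAD
```

The same bare repo can then be cloned by every developer on the team, and a deploy step can pull from it on the live server.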
I can see a few advantages to adopting Git, especially as I'm getting more requests to be involved in projects where it's already in use.
Just curious to hear how Git has been adopted in other small web firms.
I am not a small web firm, but dear god, yes. These days we deploy basically everything through Mercurial, which is comparable to Git, including static sites and WordPress.
Especially WordPress. The huge challenge there, when you have a number of WordPress installs, is keeping core current in a sane manner and keeping development portable. How we handle this: we have a central WordPress core repository containing WordPress and a couple of extensions we use across the board, and we fork our sites from there. When core gets updated, we pull in the upstream changes and merge them into each project.
Another feature of this setup is a standard way to build and spin up the site (either locally in IIS, or using Vagrant if you are not on Windows) and pull in the database. The net effect is that, presuming I've got a few prerequisites installed, I can have any of our WordPress properties running locally as a near-full-fidelity instance that can easily be pushed to production.
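The fork-from-core flow described above can be sketched with throwaway local repos standing in for the real ones (all names and file contents are invented):

```shell
# Sketch: central core repo, per-site forks, upstream merges.
set -e
tmp=$(mktemp -d)
# Throwaway identity; merge (not rebase) on pull, for the demo.
g() { git -c user.email=dev@example.com -c user.name=Dev -c pull.rebase=false "$@"; }

# Central core repo: WordPress plus the shared extensions.
mkdir "$tmp/core" && cd "$tmp/core"
g init -q
echo "wp 6.0" > core-version.txt
g add . && g commit -qm "WordPress core + shared extensions"

# A site "forked" from the core by cloning it.
g clone -q "$tmp/core" "$tmp/site"
cd "$tmp/site"
echo "site theme" > theme.txt
g add . && g commit -qm "Site-specific work"

# Later, the core gets updated...
cd "$tmp/core"
echo "wp 6.1" > core-version.txt
g commit -qam "Core update"

# ...and each site pulls the upstream changes into its own history.
cd "$tmp/site"
g pull -q --no-edit
```

After the pull, the site repo has both the new core version and its own site-specific work.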
This isn't without challenges, though most of them are really shifts in workflow to keep things in the right places for version control and automated deployment to work. The first big issue is not using the built-in WordPress tools to upgrade the site, but instead managing upgrades locally and pushing them through version control. The other issue, which I see a lot of design types struggle with, is separating operational data and content from code: not using the content uploads directory for your CSS images, for example. But it is certainly doable, and in our experience it works quite well.
Anyhow, gotta run to catch the football games but feel free to ask away.
wwb - thanks for jumping in here. I guess that brings a new complexity to my question: when using a CMS, how do you handle user-uploaded content if you're using Git? Do you pull from the master server frequently?
The way we handle it is that user-uploaded content is generally out of scope. In the case of WordPress, 85% of the content lives in the database, which you can find ways to get to the developers; smaller sites can keep a reference MySQL dump alongside the source. The images and such that end up in wp-content/uploads aren't a problem, because WordPress puts absolute links in the content, so pages render fine even when running on a local machine.
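For the database side, one common pattern (not necessarily wwb's exact setup) is exporting a reference dump, committing it with the source, and reloading it on developer machines. A sketch, assuming a running MySQL server; the database name, user, and paths are all placeholders:

```shell
# Export a reference dump to keep with the source.
# "wp_site", "dbuser", and the db/ path are placeholders.
mysqldump -u dbuser -p wp_site > db/reference-dump.sql
git add db/reference-dump.sql
git commit -m "Update reference database dump"

# On a developer machine, load it into a local database:
mysql -u dbuser -p wp_site < db/reference-dump.sql
```

Note that a dump like this may contain credentials or user data, so this approach only makes sense in a private repo.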
Or, don't put content in Git at all; put the code in there. Content is just something that gets plugged in, and it's pretty immaterial at this level.