How about FrontPage?

True but it is still quite useful, especially for newbies.

I use the DW code editor all the time, as it is a very good code editor and I have yet to find one that has all the functions I need in one place. I wouldn’t buy it now, though, if I didn’t already have it, as I only use the code editor and none of the other stuff. The wysiwyg is more wysiwtf and of no use at all.

Its code formatting, code collapsing, highlighting, error checking, and search and replace over multiple files will be hard to beat. I’ve tried many other free (and paid) editors, but none of them can match the power of DW or its ease of use. Sometimes when debugging in the forums all I have to do is copy the OP’s HTML into the window and the error is immediately highlighted - something the validator is not very helpful at doing when the error is a missing quote or a missing end tag in a mangled page.

The only downside is that on large files it becomes very sluggish and in those cases I switch to htmlPad which isn’t bad.

It’s not wise to start fresh on discontinued software.

All of that is available in Aptana. I had to double check on the search & replace over multiple files, but it’s there.

all I have to do is copy the OP’s HTML into the window and the error is immediately highlighted - something the validator is not very helpful at doing when the error is a missing quote or a missing end tag in a mangled page.

Aptana does highlight mangled HTML as you describe, but doesn’t appear to validate against strict W3C standards (though I could be mistaken).

I don’t know anybody who uses Dreamweaver in a professional environment. I reckon a designer could just about get away with it, but any web developers using Dreamweaver would create a bad smell in my book…

Yes, but if you’re using ftp for website deployment, you’re doing it wrong.

What makes you say that?

Granted, SFTP is sometimes available (but not always), and sometimes hosts have deployment packages available (though it varies from host-to-host whether or not they are kept up-to-date), but FTP is very much still alive.

FTP is just not the right way to do it.

Consider that you have these disadvantages:

  1. It’s VERY slow. Need to upload 1000 files in different directories? Go make a cup of coffee. Go make a full meal for that matter.
  2. You can’t track what has been uploaded and by whom (see the sketch after this list)
  3. Because of the above, it’s VERY easy to have different developers working on the same project go out of sync. It’s so easy for me to upload a file without realising I’m undoing another dev’s previous changes - leads to stuff breaking all the time
  4. If something breaks, you have to manually try to fix it - prepare to have real users of your system seeing broken stuff quite often.
  5. Because it takes so long for files to upload, even if your new code changes work, when uploading a bunch of files at once for an update, your site will be in a semi-working or non-working state while the upload is in progress
  6. You have to actually remember all the files you’ve changed. Changed 50 files, including html/css/php/js for a single update? That’s cool. Now you either have to remember each one individually (VERY error prone) or upload the whole site again - which again is very slow and leaves the site in a semi-working/non-working state during the process
  7. You can’t really branch, and flipping between different features if you want to test them is very hard
  8. Knowing that one server, say production, is in the exact same state as another, say staging, is basically impossible
  9. It is fundamentally insecure
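
To make point 2 concrete: with a version-controlled deploy you get an attributed history for free. A minimal sketch (the repository and format string are just an example):

    # Show the last three changes: who made them, when, and which files they touched.
    # None of this exists with plain ftp uploads.
    git log -3 --format="%h %an %ad %s" --date=short --name-only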

I believe these are good enough reasons to never use it.

The only acceptable use case for ftp for webdev in my opinion is a single dev working on a VERY small site. I can’t think of any other circumstances where I’d be happy to use it.

Bare bones ssh over a terminal and doing git/hg pulls is VASTLY superior to ftp, and that’s before you even look at purpose-built deployment software.
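
For anyone who hasn’t seen that workflow, it’s essentially one command. A rough sketch (the hostname, path and branch are placeholders):

    # Log into the server and pull the latest changes from the repository.
    ssh deploy@example.com "cd /var/www/mysite && git pull origin master"

Every changed file arrives in one fast transfer, and the server’s own history tells you exactly what is deployed.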

Put it this way - if I’m told I HAVE to use ftp to work on something, I’ll either turn it down, or I’ll just assume the people I’m working with are amateurs.

Misconception. The time it takes to upload mostly depends upon your upload bandwidth capacity.

  2. You can’t track what has been uploaded and by whom

Sure, if everyone shares the same FTP account. But why do you have so many users uploading to the same directory that you can’t keep track of them?

  3. Because of the above, it’s VERY easy to have different developers working on the same project go out of sync. It’s so easy for me to upload a file without realising I’m undoing another dev’s previous changes - leads to stuff breaking all the time

You shouldn’t be using FTP as a development environment. You need version control software, such as CVS or Git. (Here’s a SitePoint article which gives an overview of Git: http://www.sitepoint.com/the-designers-guide-to-git-or-how-i-learned-to-stop-worrying-and-love-the-repository/ )
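
Getting started with Git takes only a few commands. A minimal sketch (the commit message is just an example):

    git init                       # turn the project folder into a repository
    git add .                      # stage every file in it
    git commit -m "First version"  # record a snapshot with author and date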

  4. If something breaks, you have to manually try to fix it - prepare to have real users of your system seeing broken stuff quite often.

Something tells me you’re not using a CMS. FTP has nothing to do with this. Somebody simply isn’t very good at managing files on the server.

  5. Because it takes so long for files to upload, even if your new code changes work, when uploading a bunch of files at once for an update, your site will be in a semi-working or non-working state while the upload is in progress

Again, it sounds like you have upload capacity limitations.

  6. You have to actually remember all the files you’ve changed. Changed 50 files, including html/css/php/js for a single update? That’s cool. Now you either have to remember each one individually (VERY error prone) or upload the whole site again - which again is very slow and leaves the site in a semi-working/non-working state during the process

Again, version control.

  7. You can’t really branch, and flipping between different features if you want to test them is very hard

I don’t know what you’re saying here, but again, it sounds like a development/versioning problem. Not an FTP problem.

  8. Knowing that one server, say production, is in the exact same state as another, say staging, is basically impossible

This sounds like another software development problem, rather than a problem with FTP.

  9. It is fundamentally insecure

Ok, finally one point that is directly related to FTP. Yes, usernames/passwords and data are sent in the clear. Check with your host to see if SFTP is available.
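
If it is, switching is nearly painless, since SFTP runs over SSH and the client commands are almost identical. A quick sketch (the hostname is a placeholder):

    sftp user@example.com    # log in over an encrypted channel
    sftp> put index.html     # same put/get commands as a classic FTP client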

I believe these are good enough reasons to never use it.

Only 1 or 2 of your points are directly related to problems with FTP. The others are problems that arise during the course of software development or because of too many cooks in the kitchen.

The only acceptable use case for ftp for webdev in my opinion is a single dev working on a VERY small site. I can’t think of any other circumstances where I’d be happy to use it.

FTP has uses beyond web development.

Bare bones ssh over a terminal and doing git/hg pulls is VASTLY superior to ftp, and that’s before you even look at purpose-built deployment software.

In terms of security and version control, yes.

Put it this way - if I’m told I HAVE to use ftp to work on something, I’ll either turn it down, or I’ll just assume the people I’m working with are amateurs.

Don’t assume this. There can be several practical reasons as to why folks use FTP.

Not true. It’s just really slow. A thousand files means a thousand separate transfers - each file is uploaded individually, with its own per-file overhead, so it takes a long time. Even with awesome bandwidth, it’s still slow to upload 1000+ individual files.
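
A rough way to see the difference: pack everything into a single archive and make one transfer instead of a thousand. Sketch only (paths and hostname are made up):

    tar czf site.tar.gz public_html/         # 1000 files become one archive
    scp site.tar.gz user@example.com:/tmp/   # one connection, one transfer
    ssh user@example.com "tar xzf /tmp/site.tar.gz -C /var/www"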

Sure, if everyone shares the same FTP account. But why do you have so many users uploading to the same directory that you can’t keep track of them?

I’m talking about using ftp for deployment, but not only this - the idea of many developers deploying over ftp at the same time. Means things go out of sync on the server easily.

You shouldn’t be using FTP as a development environment. You need version control software, such as CVS or Git. (Here’s a SitePoint article which gives an overview of Git: http://www.sitepoint.com/the-designers-guide-to-git-or-how-i-learned-to-stop-worrying-and-love-the-repository/ )

I don’t use ftp. I use version control already. That’s kinda the point I’m making. I don’t use ftp for development or deployment or anything else for that matter.

Something tells me you’re not using a CMS. FTP has nothing to do with this. Somebody simply isn’t very good at managing files on the server.

I have no idea what you are going on about with this comment. Not using a CMS? What has this got to do with anything? What are you referring to my management of files for? I never experience any of these problems.

Again, it sounds like you have upload capacity limitations.

No.

Again, version control.

Which I already use (what is your point?)

I don’t know what you’re saying here, but again, it sounds like a development/versioning problem. Not an FTP problem.

I’m talking about doing this directly on a live server. If you use git/mercurial on the server itself (and deploy that way), then you can actually have branches on the server and flip between them. It’s especially good for staging environments. It’s not a problem. I don’t have a problem.
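
In practice, flipping between features on a staging box is a couple of commands. A sketch (the branch names are hypothetical):

    git fetch origin                  # grab the latest branches from the repository
    git checkout feature-new-header   # try the feature branch live on staging
    git checkout master               # flip straight back when you’re done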

This sounds like another software development problem, rather than a problem with FTP.

I don’t have a problem. Using ftp for deployment means you can’t do this on the server so easily. SSH’ing into the machine and using VC means you can guarantee versions between different servers are identical.
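
Verifying that takes one command per server. A sketch (the hostnames and path are placeholders):

    ssh deploy@staging.example.com "cd /var/www/mysite && git rev-parse HEAD"
    ssh deploy@www.example.com "cd /var/www/mysite && git rev-parse HEAD"
    # If the two hashes match, the code on both servers is identical.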

Ok, finally one point that is directly related to FTP. Yes, usernames/passwords and data are sent in the clear. Check with your host to see if SFTP is available.

I don’t need to check anything with my host, because I install everything on the machine and manage all my servers myself, from the command line. I don’t use sftp, just like I don’t use ftp. For anything.

Only 1 or 2 of your points are directly related to problems with FTP. The others are problems that arise during the course of software development or because of too many cooks in the kitchen.

No, they are all legitimately related to using ftp for deploying websites.

FTP has uses beyond web development.

Never said it didn’t. I’m not talking about ftp itself. I’m talking about using ftp for deploying websites, which you should never do.

In terms of security and version control, yes.

Which is the entire point I was making.

Don’t assume this. There can be several practical reasons as to why folks use FTP.

Examples?

Try Evrsoft First Page. It is free and easy to use. What I like about it is that you can use wysiwyg and HTML at the same time.

Yeah, use some rubbish software from 2006!

FTP is a perfectly valid way to transfer files, so long as you understand its limitations. Its name tells you what it’s designed to do.

You can increase the number of simultaneous uploads (FileZilla has this feature), which speeds up the process. However, FTP still transfers each file in its entirety, which slows things down compared to versioning systems, which only transfer the changes.

I’m talking about using ftp for deployment, but not only this - the idea of many developers deploying over ftp at the same time. Means things go out of sync on the server easily.

That’s a matter of either choosing the right tool for the job, or a problem with too many cooks in the kitchen.

I have no idea what you are going on about with this comment. Not using a CMS? What has this got to do with anything? What are you referring to my management of files for? I never experience any of these problems.

I’ve seen where some folks try to manage files on their web server using FTP, when they really should be using a CMS instead to organize things. Of course, if you’re talking development source files, this doesn’t exactly apply. I was thinking more user-facing downloadable files.

I don’t need to check anything with my host, because I install everything on the machine and manage all my servers myself, from the command line. I don’t use sftp, just like I don’t use ftp. For anything.

Not everyone uses their own dedicated box. Some use a managed hosting service.

Never said it didn’t. I’m not talking about ftp itself. I’m talking about using ftp for deploying websites, which you should never do.

For large-scale deployments, I agree. For small-scale stuff, it’s acceptable.

Apart from Dreamweaver, I think you should also try Notepad++. It might sound odd, me suggesting Notepad++, but the fact is that it is free yet has a good interface for developing sites, whereas Dreamweaver is not free. Dreamweaver is a licensed, paid application, so it has to be better in functionality, but if you know how to code then Notepad++ is also not bad!

The whole point I was making is that FTP is not appropriate for website deployment. For transferring a couple of files across the internet, it’s fine, but for actually deploying websites it sucks.

Another problem I have with ftp is that in my experience, people who use it for deployment generally don’t use version control, and even worse - often they set up their IDE (which for people deploying websites via FTP is usually something quite poor like Dreamweaver) to automatically upload files when they save them after editing. I’ve seen people say this is one of their favourite “features” of their IDE - this is something that shouldn’t even be possible. It’s so dangerous.

My experience with using FTP for deployment in practice has been entirely negative. People should be using either a direct ssh connection to pull from a remote repository, or some other dedicated deployment service that handles everything for them.

Good thread - what is the latest offering from Microsoft for website development?

Microsoft Visual Studio 2012