Large File download issue

Hello,

I am working on a project that downloads large zip files from a server. For small files the script works well and downloads the files successfully, but for larger files (currently we are trying to download a 922MB file) it gives us this message in Firefox and doesn’t download anything.

"
File not found

Firefox can’t find the file at http://www.domainname.com/abc.zip

"
Script to download the file is as below:

"
$filename = "xyz.mp3;

header(“Pragma: public”);
header(“Expires: 0”);
header(“Cache-Control: must-revalidate, post-check=0, pre-check=0”);

header(“Content-Type: application/force-download”);
header(“Content-Type: application/octet-stream”);
header(“Content-Type: application/download”);

header(“Content-Disposition: attachment; filename=”.basename($filename).“;”);

header(“Content-Transfer-Encoding: binary”);
header("Content-Length: ".filesize($filename));

if( !ini_get(‘safe_mode’) )
set_time_limit(360000000);

readfile(“$filename”);
"

Please advise what the issue could be. If it is a file-size issue, how and where can we increase the limit to solve it?

pre-thanks,

I suggest you try X-Sendfile for large files. You can also check this blog post about sending files with Apache mod_xsendfile and PHP: http://codeutopia.net/blog/2009/03/06/sending-files-better-apache-mod_xsendfile-and-php/

PS: you can remove the double quotes from readfile("$filename");

GoDaddy has asked for the following in order to install mod_xsendfile on our server:

“Written acknowledgment/agreement of the following potential issues, including, but not limited to:
- Functionality differences between the old and new software.
- Compatibility issues of installed services/applications/dependencies.
- Application version differences (PHP, MySQL, etc.).”

Please advise. Thanks

Is there an actual reason why you are using PHP to send the file? Why don’t you have the server do this work like it was meant to? I.e., have the server deliver the file like it does your images. A large file like this 900MB one is just going to kill PHP’s memory limit.

@logic_earth
What do you mean by using the server to send files?

I mean… the way the server sends images. You just enter the image URL in the address bar of your browser and the server sends it, no PHP involved.

These are all paid zips, and if we use the browser’s address bar (http://www.domainnaem.com/zips/abc.zip) to download them then the path will be seen by everyone and it will not be secure.

That’s probably what is going on - it’s probably causing an error and thus the file stream won’t be output to the browser.

@op: You need to rethink the logic here. What you need to do is hide the files and output a header to them when you receive a valid token from your user (the token being supplied in the url).

With regard to the file itself, you need to read from the file in a loop and write it out to the client - not use readfile(), which is only good for smaller files. For larger files you need to read a few KB at a time and print that in a loop. This means that you only ever have a few KB of the file in memory at any time.

Of course, by doing that you then need to support pause and resume, so hit Google for a function called dl_file_resumable.

That function will also show you how to read from a file and print it out a few KB at a time.
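
The core of it is just a read-and-print loop, roughly like this (a minimal sketch without the resume support; the path and chunk size are only examples):

$filename = '/home/username/zips/abc.zip';   // example path to the protected file
$handle = fopen($filename, 'rb');            // open for binary reading

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

// if output buffering is on, call ob_end_clean() first so the whole file
// is not collected in memory anyway
while (!feof($handle)) {
    echo fread($handle, 8192);   // send 8 KB at a time...
    flush();                     // ...and push it straight to the client
}
fclose($handle);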

I’d suggest using .htaccess to set those headers instead of PHP. As a rule, wrapping a file send in PHP is a disastrously bad idea… let Apache or whatever HTTP server you are using do its job!
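
For the zips that could be as simple as this in .htaccess (assuming mod_headers is available; the file pattern is just an example):

<FilesMatch "\.zip$">
    ForceType application/octet-stream
    Header set Content-Disposition "attachment"
</FilesMatch>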

Yes, but if the downloads are being paid for then, as the OP says, you need to protect your income, otherwise people will just pass the URL around for a free download. THAT is why PHP is very useful for download situations like this.

GoDaddy has installed mod_xsendfile on our server, but we are still not able to download large files.

We are using these headers

header('Content-Disposition: attachment;filename=hello.txt');
header('X-Sendfile: /home/username/hello.txt');
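
As far as we can tell from the mod_xsendfile documentation, the Apache side also needs the module switched on and, for files outside the web root, the directory whitelisted, along these lines (the paths are placeholders):

XSendFile On
XSendFilePath /home/username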

Any suggestions?
Thanks

Then you can make individual URLs for those files that expire after some time, for example after 12 hours. If a user requests a file then you give them a unique URL like

http://www.domainnaem.com/048g7h1mh89ho1jobdbx3ied9f0xim9fx49mh7yw/abc.zip

You generate a random string for the URL so that no one can guess it - and it is not a folder name, although it looks like one. Then use PHP to add this URL to a rewrite rule in your .htaccess file so that the file is accessible for download using this particular URL. In this way you skip PHP entirely in the download process and your server serves the file directly.

Just remember to run a script every X minutes which will remove obsolete URLs from your .htaccess so that they expire. You can use some database to keep track of them.
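
A rough sketch of the generation side (the file names, paths and rewrite target below are just examples):

// generate an unguessable token and map it to the real file
$token = md5(uniqid(mt_rand(), true));
$rule  = "RewriteRule ^" . $token . "/abc\\.zip$ zips/abc.zip [L]\n";

// append the rule; this assumes "RewriteEngine On" is already near the top of .htaccess
file_put_contents('/home/username/public_html/.htaccess', $rule, FILE_APPEND | LOCK_EX);

// store the token and a timestamp (e.g. in a database) so a cron job can
// strip expired rules out of .htaccess again once they are, say, 12 hours old
echo 'http://www.domainnaem.com/' . $token . '/abc.zip';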

I have seen some music download sites use this technique. I don’t know what kind of scripting they use but they generate unique download URLs that are accessible for 24 hours. This has the advantage that people can use their favourite download managers to get the file, which can be important for such huge downloads.

Well, if that’s the code line by line, I think a quick throw into the PHP colorizer will show you where your error is…



$filename = "xyz.mp3;

header("Pragma: public");
header("Expires: 0"); 
header("Cache-Control: must-revalidate, post-check=0, pre-check=0"); 

header("Content-Type: application/force-download");
header("Content-Type: application/octet-stream");
header("Content-Type: application/download");

header("Content-Disposition: attachment; filename=".basename($filename).";");

header("Content-Transfer-Encoding: binary");
header("Content-Length: ".filesize($filename));

if( !ini_get('safe_mode') )
set_time_limit(360000000);

readfile("$filename"); 

*hums to the tune of “Little Black Raincloud” * I’m just a dangling open string…sitting at the top of your file…