Using cURL to download large XML files

I'm working with PHP and need to parse a number of fairly large XML files (50-75 MB uncompressed). The issue, however, is that these XML files are stored remotely and need to be downloaded before I can parse them.

Having thought about the issue, I think using a system() call in PHP to initiate a cURL transfer is probably the best way to avoid timeouts and PHP memory limits.

Has anyone done anything like this before? Specifically, what should I pass to cURL to download the remote file and ensure it's saved to a local folder of my choice?
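
A minimal sketch of that approach, assuming a hypothetical URL and target path (curl's -o flag names the output file, -s silences progress output, -L follows redirects, and --max-time caps the transfer):

$url    = 'https://example.com/feed.xml';  // hypothetical remote file
$target = '/tmp/feed.xml';                 // hypothetical local path

// The download happens in the curl process, not in PHP,
// so PHP's memory limit never sees the file contents.
system('curl -sL --max-time 600 -o ' . escapeshellarg($target)
     . ' ' . escapeshellarg($url), $status);

if ($status === 0 && file_exists($target)) {
    // parse the XML here
}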

Best answer

You can try this:

function download($src, $dst) {
    // Stream the remote file to disk in small chunks so the
    // whole document never has to fit in PHP's memory limit.
    $f = fopen($src, 'rb');
    $o = fopen($dst, 'wb');
    if ($f === false || $o === false) {
        return 1;
    }
    while (!feof($f)) {
        if (fwrite($o, fread($f, 2048)) === false) {
            fclose($f);
            fclose($o);
            return 1;
        }
    }
    fclose($f);
    fclose($o);
    return 0;
}
download($url, $target);
if (file_exists($target)) {
    # do your stuff
}
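
Note that the snippet above streams with fopen() rather than cURL; it keeps memory use flat, but it still runs inside PHP, so max_execution_time applies. If allow_url_fopen is disabled, or you prefer cURL without shelling out, a sketch using PHP's curl extension with CURLOPT_FILE (which writes the response straight to disk instead of buffering it) might look like this; the 600-second timeout is an assumption:

function download_curl($src, $dst) {
    $o = fopen($dst, 'wb');
    if ($o === false) {
        return 1;
    }
    // CURLOPT_FILE makes cURL write the response body directly
    // to the file handle, so memory use stays constant.
    $ch = curl_init($src);
    curl_setopt($ch, CURLOPT_FILE, $o);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 600); // assumed cap, tune as needed
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($o);
    return $ok === true ? 0 : 1;
}

Either way the body is never buffered in PHP, so a 50-75 MB download stays well clear of the memory limit.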
Answers

No other answers yet




Related questions
Editing large files

For the foreseeable future I will need to edit a large number of files (mostly flat text files, but they could be CSV, fixed-width, XML, ... so far). I need to develop ...

How can I quickly parse large (>10GB) files?

I have to process text files 10-20 GB in size of the format: field1 field2 field3 field4 field5. I would like to parse the data from each line of field2 into one of several files; the file this gets ...

gcc/g++: error when compiling large file

I have an auto-generated C++ source file, around 40 MB in size. It largely consists of push_back commands for some vectors and string constants that shall be pushed. When I try to compile this file, g+...

Is git worth for managing many files bigger than 500MB

I would put under version control a big amount of data, i.e. a directory structure (with depth <= 5) with hundreds of files of about 500 MB each. The things I need are a system that helps me: - to ...
