
Re: Importing large amount of historical data

So what should I do here - bang the max memory for PHP up to 8GB and do the archive run again?
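For concreteness, this is the sort of command I'd be rerunning (the install path and URL are placeholders for my setup, and I gather -d memory_limit=8G overrides php.ini for just that one CLI run):

    php -d memory_limit=8G /path/to/piwik/misc/cron/archive.php \
        --url=http://piwik.example.com/ \
        --force-all-periods --force-all-websites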

One problem I foresee is that, as I said, I have several websites to import, but none are anywhere near as big as the big one. Do I need to import all the data for all of them first, and then run the archive process? It seems I always need to use --force-all-periods and --force-all-websites to get historical data archived, so I'd end up reprocessing everything, and I expect that's going to be slow.
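In case it's relevant, the import side here is the bundled log importer (misc/log-analytics/import_logs.py), run one site at a time, roughly like this, with the URL, idsite and log path all placeholders:

    python /path/to/piwik/misc/log-analytics/import_logs.py \
        --url=http://piwik.example.com/ --idsite=2 \
        /var/log/apache2/smallsite-access.log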

I've had a look at the archive.php code, and it doesn't look like it would take much effort to hack it to only process a particular website, and just a bit more effort to turn that into something specifiable on the command line, roughly the sketch below. Is that a reasonable assessment?
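To show what I mean, here's a purely hypothetical sketch of the hack; the option name, variable names and loop are my invention, not actual archive.php internals:

    // Hypothetical patch sketch, not real archive.php code:
    // accept an --idsite=N option and skip every other website.
    $onlyIdSite = null;
    foreach ($_SERVER['argv'] as $arg) {
        if (preg_match('/^--idsite=(\d+)$/', $arg, $matches)) {
            $onlyIdSite = (int) $matches[1];
        }
    }

    // ...then wherever the script loops over the website IDs:
    foreach ($websiteIds as $idSite) {
        if ($onlyIdSite !== null && $idSite != $onlyIdSite) {
            continue; // not the site asked for, skip it
        }
        // existing per-site archiving logic stays as-is
    }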

I have one install where I've already read in this data but the archive process failed - if I rerun it there with the larger memory, will it happily start over?
