Increasing PHP memory_limit. At what point does it become insane?

In a system I am currently working on, there is one process that loads a large amount of data into an array for sorting/aggregating/whatever. I know this process needs optimising for memory usage, but in the short term it just needs to work.

Given the amount of data loaded into the array, we keep hitting the memory limit. It has been increased several times, and I am wondering: is there a point where increasing it becomes a generally bad idea, or is it only a matter of how much RAM the machine has?

The machine has 2GB of RAM and the memory_limit is currently set at 1.5GB. We can easily add more RAM to the machine (and will anyway).

Have others encountered this kind of issue? And what were the solutions?


1 Answer


The configuration of PHP's memory_limit when it runs as an Apache module to serve webpages has to take into consideration how many Apache processes you can have at the same time on the machine -- see the MaxClients configuration option for Apache.

If MaxClients is 100 and you have 2,000 MB of RAM, a very quick calculation shows that you should not use more than 20 MB for the memory_limit value (because 20 MB * 100 clients = 2,000 MB of RAM, i.e. the total amount of memory your server has).
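
Put as a tiny PHP calculation (the RAM and MaxClients figures are the example values above, not measurements from your server):

<?php
// Worst case: every Apache worker runs a PHP script using the full memory_limit.
$totalRamMb = 2000; // the 2 GB machine from the question
$maxClients = 100;  // Apache MaxClients (example value)
echo $totalRamMb / $maxClients, " MB per process at most\n"; // prints "20 MB per process at most"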

And this is without considering that there are probably other things running on the same server, like MySQL or the system itself, ... and that Apache is probably already using some memory for itself.

Of course, this is also a "worst case scenario", one that assumes each PHP page uses the maximum amount of memory it can.


In your case, if you need such a big amount of memory for only one job, I would not increase the memory_limit for PHP running as an Apache module.

Instead, I would launch that job from the command line (or via a cron job), and specify a higher memory_limit specifically in this one and only case.

This can be done with the -d option of php, like:

$ php -d memory_limit=1GB temp.php
string(3) "1GB"

Considering, in this case, that temp.php only contains:

<?php
var_dump(ini_get('memory_limit')); // prints the memory_limit actually in effect for this run

In my opinion, this is way safer than increasing the memory_limit for the PHP module for Apache -- and it's what I usually do when I have a large dataset, or some really heavy stuff I cannot optimize or paginate.
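
And if the job has to run on a schedule, the same trick works from cron; a hypothetical crontab entry (the path, the schedule and the limit are made up for illustration) could look like:

# run the heavy batch nightly at 02:00, raising memory_limit only for this job
0 2 * * * php -d memory_limit=1536M /path/to/batch.php >> /var/log/batch.log 2>&1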


If you need to define several values for the PHP CLI execution, you can also tell it to use another configuration file, instead of the default php.ini, with the -c option:

php -c /etc/phpcli.ini temp.php

That way, you have:

  • /etc/php.ini for Apache, with low memory_limit, low max_execution_time, ...
  • and /etc/phpcli.ini for batches run from command-line, with virtually no limit

This ensures your batches will be able to run -- and you'll still have security for your website (memory_limit and max_execution_time being security measures).
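
For illustration, the split could look roughly like this (the values are assumptions to show the idea, not recommendations):

; /etc/php.ini -- used by Apache: keep the limits tight
memory_limit = 32M
max_execution_time = 30

; /etc/phpcli.ini -- used by command-line batches: much more room
memory_limit = 1024M
max_execution_time = 0 ; 0 means no time limit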


Still, if you have the time to optimize your script, you should; for instance, in that kind of situation where you have to deal with lots of data, pagination is a must-have ;-)
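
One possible direction, as a sketch only (the file name and the 'amount' column are placeholders for your real data source): stream the rows through a generator and aggregate as you go, so the whole dataset never has to sit in a single array.

<?php
// Sketch: read a large CSV one row at a time with a generator instead of
// loading everything into an array. Only the current row is held in memory.
// Assumes a well-formed CSV whose first line is the column names.
function readRows(string $path): Generator
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open {$path}");
    }
    $header = fgetcsv($handle); // first line = column names
    while (($fields = fgetcsv($handle)) !== false) {
        yield array_combine($header, $fields);
    }
    fclose($handle);
}

$total = 0;
foreach (readRows('data.csv') as $row) {
    $total += (float) $row['amount']; // aggregate on the fly
}
echo "Total: {$total}\n";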

