I was wondering what the best method is to cache search results in PHP.
Suppose you have a directory search on a website where a user enters parameters like city, state, and country, and a list of pages is fetched, with 10 results displayed per page. If the result set is large, the results take some time to load when the user navigates from one page to the next. In that scenario I want to implement some sort of caching so that results are stored somewhere and load quickly.
I also want to control the number of results that can be cached, so that caching itself doesn't take much time.
Please suggest some good techniques.
A couple of ideas spring to mind. Some of them you may well already be using, and I'm sorry if some of them seem obvious.
Are you in fact filtering rather than searching?
Check that your database tables' indexes are optimised for the columns you filter on.
Use the MySQL LIMIT clause so you only fetch 10 rows at a time.
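As a sketch of that point, here is a paginated fetch using LIMIT and OFFSET with bound parameters. It uses an in-memory SQLite table as a stand-in for a real MySQL directory table, and the `listings` table and `city` column are just illustrative names:

```php
<?php
// In-memory SQLite stands in for the real MySQL table here.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE listings (id INTEGER PRIMARY KEY, city TEXT)');
$insert = $pdo->prepare('INSERT INTO listings (city) VALUES (?)');
for ($i = 0; $i < 25; $i++) {
    $insert->execute(['Springfield']);
}

// Fetch one page of matching rows; page numbers start at 1.
function fetchPage(PDO $pdo, string $city, int $page, int $perPage = 10): array
{
    $offset = ($page - 1) * $perPage;
    $stmt = $pdo->prepare(
        'SELECT id, city FROM listings WHERE city = :city LIMIT :limit OFFSET :offset'
    );
    $stmt->bindValue(':city', $city);
    $stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

$page3 = fetchPage($pdo, 'Springfield', 3);
echo count($page3), "\n"; // 25 rows in pages of 10 → page 3 has 5
```

The key point is that only one page's worth of rows ever crosses the wire, instead of pulling the whole result set and slicing it in PHP.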
Identify which are the popular searches and concentrate on caching them. (Do you record the filter terms? If the search uses the GET method, you might be able to grep through your logfiles looking for patterns of popular filter terms.)
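A rough sketch of that log-mining idea, tallying a `city` parameter from GET request lines. The log format and the parameter name are assumptions; adjust the regex to match your server's actual log format:

```php
<?php
// Sample request lines as they might appear in an access log.
$logLines = [
    'GET /search.php?city=London&country=UK HTTP/1.1',
    'GET /search.php?city=Paris&country=FR HTTP/1.1',
    'GET /search.php?city=London&country=UK HTTP/1.1',
];

$counts = [];
foreach ($logLines as $line) {
    // Capture the query string after the "?" in the request path.
    if (preg_match('/GET\s+\S*\?(\S+)\s/', $line, $m)) {
        parse_str($m[1], $params);   // decode the query string
        if (isset($params['city'])) {
            $city = $params['city'];
            $counts[$city] = ($counts[$city] ?? 0) + 1;
        }
    }
}
arsort($counts);                     // most popular terms first
print_r($counts);
```

Once you know which terms dominate, you can pre-warm or pin the cache for just those searches rather than caching everything.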
If you are using the GET method to handle your filtered data, installing a semi-automated opcode cache like APC might help.
When deciding what to cache, consider whether you want to cache just the data, or an HTML segment containing the data set, formatted and ready to be slotted straight into your page template.
Make sure you are clear about how often the data changes; this may dictate the cache's TTL (time to live).
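To tie the last few points together, here is a minimal file-based cache sketch with a TTL and a cap on the number of cached searches (which also addresses limiting how much gets cached). All the function and directory names are illustrative, and in production you would more likely reach for APC/APCu, Memcached, or similar:

```php
<?php
// Same filters in any order produce the same cache key.
function cacheKey(array $filters): string
{
    ksort($filters);
    return md5(serialize($filters));
}

// Return the cached value, or null if missing or older than $ttl seconds.
function cacheGet(string $dir, string $key, int $ttl)
{
    $file = $dir . '/' . $key . '.cache';
    if (!is_file($file) || time() - filemtime($file) > $ttl) {
        return null;
    }
    return unserialize(file_get_contents($file));
}

// Store a value, evicting the oldest entry once $maxEntries is reached.
function cachePut(string $dir, string $key, $value, int $maxEntries = 100): void
{
    $files = glob($dir . '/*.cache');
    if (count($files) >= $maxEntries) {
        usort($files, fn($a, $b) => filemtime($a) <=> filemtime($b));
        unlink($files[0]);
    }
    file_put_contents($dir . '/' . $key . '.cache', serialize($value));
}

// Usage: cache one page of results (could equally be an HTML fragment).
$dir = sys_get_temp_dir();
$key = cacheKey(['city' => 'London', 'country' => 'UK', 'page' => 1]);
if (($results = cacheGet($dir, $key, 300)) === null) {
    $results = ['row1', 'row2'];   // stand-in for the real DB query
    cachePut($dir, $key, $results);
}
```

Whether you serialize raw data or store a rendered HTML fragment, the key/TTL/eviction logic stays the same; only the payload changes.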
This isn't a checklist BTW, just a couple of thoughts you might want to consider.