shaydez — 2010-06-26T20:33:17-04:00 — #1
I'm not sure if this goes under PHP or Database.
I have a database with over 100k rows of search results and tracking information.
When I try to query and compile a report, it takes FOREVER... are there any techniques for running faster queries?
The database columns are:
id | ipaddress | page | product | timedate | se_keywords
dan_grossman — 2010-06-26T20:52:19-04:00 — #2
Define indexes on the columns you use to filter rows (i.e. the WHERE clauses and JOIN conditions of the query). If you do that, 100k rows should be no problem at all.
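To illustrate the effect, here's a minimal sketch using SQLite through Python's `sqlite3` (standing in for whatever database shaydez is on — the table and index names are made up for the example, and the same `CREATE INDEX` idea applies in MySQL etc.):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tracking (
        id INTEGER PRIMARY KEY,
        ipaddress TEXT,
        page TEXT,
        product TEXT,
        timedate TEXT,
        se_keywords TEXT
    )
""")

# Index the columns your WHERE clauses and JOINs filter on.
conn.execute("CREATE INDEX idx_tracking_product ON tracking (product)")
conn.execute("CREATE INDEX idx_tracking_timedate ON tracking (timedate)")

# Ask the engine how it will run the query. With the index in place,
# it searches the index instead of scanning all 100k rows.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM tracking WHERE product = ?",
    ("widget",),
).fetchall()
print(plan)
```

Running `EXPLAIN` (or `EXPLAIN QUERY PLAN` in SQLite) before and after adding the index is the quickest way to confirm the query actually uses it — the plan should say it's searching via the index rather than doing a full table scan.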
shaydez — 2010-06-28T12:24:57-04:00 — #5
Yeah, I know about indexes... I just wasn't sure if there's a different method.
How do big companies handle hundreds of thousands of records, millions possibly? Especially if you're tracking thousands of users accessing pages and products daily.