Hi
I am wondering what those of you whose queries select many thousands of rows do to avoid killing the CPU or the web page displaying the results?
I have a very simple PHP page with an equally simple SQL query. The query selects roughly 84,000 postal codes. Normally I wouldn't select all postal codes, but I chose this table because it has a fair number of rows and is simple to query without joins:
<?php
require_once('./libs/page_queries/db_handler_base.php');
$o_Db = new DbHandlerBase();
$sql = 'SELECT postal_code, city_name, state_prov, country FROM postal_codes';
$o_Db->setSQL($sql);
$results = $o_Db->runCursorQuery();
?>
When the runCursorQuery() method runs, it flat-lines the CPUs of my local machine (which is quite fast, with 4 GB of RAM).
The corresponding db_handler_base.php that is included has the following:
<?php
class DbHandlerBase {
    protected $pdo;
    protected $stmt;
    protected $sql;

    public function __construct() {
        $this->pdo = new DbFactory();
    }
    public function runQuery() {
        if ($this->sql) {
            $this->stmt = $this->pdo->db->prepare($this->sql);
            try {
                if ($this->stmt->execute()) {
                    $result = $this->stmt->fetchAll(PDO::FETCH_ASSOC);
                    return $result;
                }
            } catch (PDOException $e) {
                print("Could not execute statement.\n");
                print("errorCode: " . $this->stmt->errorCode() . "\n");
                print("errorInfo: " . join(", ", $this->stmt->errorInfo()) . "\n");
            }
        } else {
            throw new Exception('$sql parameter needs to be set');
        }
    }
    public function setSQL($sql) {
        $this->sql = $sql;
    }
    public function runCursorQuery() {
        try {
            $this->stmt = $this->pdo->db->prepare($this->sql, array(PDO::ATTR_CURSOR => PDO::CURSOR_SCROLL));
            $this->stmt->execute();
            $result = array();
            while ($row = $this->stmt->fetch(PDO::FETCH_NUM, PDO::FETCH_ORI_NEXT)) {
                $result[] = $row;
            }
            return $result;
        } catch (PDOException $e) {
            print $e->getMessage();
        }
    }
}
class DbFactory {
    public $db;
    protected $db_factory;

    function __construct() {
        // Note: a constructor's return value is ignored; the assignment to $this->db
        // is what matters. MySQL expects the charset name "utf8", not "UTF-8".
        $this->db = new PDO("mysql:host=localhost;dbname=users;charset=utf8", "user", "userpassword");
    }
}
?>
I offloaded the responsibility for the query to the server by using cursors.
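For comparison, here is a minimal sketch of an unbuffered query, which (with the mysqlnd driver) streams rows from the server one at a time instead of buffering the entire result set in PHP's memory. The connection settings mirror the DbFactory above, and the per-row echo is just a placeholder for whatever output the page does:

```php
<?php
// Sketch: same query, but with MySQL result buffering turned off, so PHP
// holds only one row at a time instead of all 84,000 at once.
$pdo = new PDO("mysql:host=localhost;dbname=users;charset=utf8",
               "user", "userpassword");
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT postal_code, city_name, state_prov, country FROM postal_codes');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // Emit each row as it arrives rather than accumulating a $result array.
    echo implode("\t", $row), "\n";
}
```

One caveat with unbuffered queries: all rows must be fetched (or the statement closed) before another query can run on the same connection.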
I know this is just a contrived test, but when my application is done it will not be uncommon for a user to query and get back 45,000 or so records, so I know I will hit this again. I know I can raise the memory available to PHP in my server's php.ini configuration, but I feel that might mask a more basic problem.
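One option I have been considering is paging the results with LIMIT/OFFSET so the page only ever handles a few hundred rows at a time. This is just a sketch; the pageBounds() helper and the 500-row page size are made-up for illustration, not part of my app:

```php
<?php
// Sketch: compute which slice of a big result set one page should fetch.
function pageBounds($totalRows, $pageSize, $page) {
    $pages  = (int) ceil($totalRows / $pageSize);  // total number of pages
    $page   = max(1, min($page, $pages));          // clamp the requested page
    $offset = ($page - 1) * $pageSize;             // rows to skip
    return array($offset, $pageSize, $pages);
}

// The query then fetches only one page's worth of rows, e.g.:
//   SELECT postal_code, city_name, state_prov, country
//   FROM postal_codes ORDER BY postal_code LIMIT :limit OFFSET :offset
list($offset, $limit, $pages) = pageBounds(84000, 500, 3);
// Page 3 of 168: skip 1000 rows, fetch the next 500.
```

An ORDER BY is needed alongside LIMIT/OFFSET so the pages come back in a stable order.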
Any thoughts on how I might do this so I don't tax users' local machines, especially since they will likely not be as fast or have as much RAM as my development machine?
By the way, when I run this same query from MySQL Workbench or the command line, it takes about 24 seconds.
Thanks,
Steve