Loading from a Large Database Table with 3 Million Records!

Obtaining the total record count is usually one of the most expensive database operations, because it requires a sequential scan that reads every row. When the table is very large, performance takes a huge hit.
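To get a feel for the cost, here is a rough, hypothetical timing sketch using PDO; the DSN, credentials, and the employees database name are placeholder assumptions. On an InnoDB table with millions of rows, COUNT(*) typically has to scan an entire index, so it runs far slower than an indexed lookup.

// Rough timing sketch (hypothetical DSN and credentials).
$pdo = new PDO("mysql:host=localhost;dbname=employees", "user", "pass");

$start = microtime(true);
// Force the full count -- on a ~3M-row table this is the slow part.
$total = $pdo->query("SELECT COUNT(*) FROM salaries")->fetchColumn();
printf("COUNT(*) over %d rows took %.3f seconds\n", $total, microtime(true) - $start);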

With a large database, we can simply disable the counting operation by calling enable_pagecount() and passing false as the parameter:

$dg->enable_pagecount(false);

Below is the entire code for the demo. Note that the salaries table has nearly 3 million records!

// Always include namespace and conf.php on TOP of the script.
use phpCtrl\C_DataGrid;
require_once("/path/to/conf.php");  

$dg = new C_DataGrid("SELECT * FROM salaries", "emp_no", "salaries");
$dg->enable_edit('INLINE')->set_dimension('800px');
$dg->enable_pagecount(false);
$dg->enable_export(true);
$dg->display();

In general, to improve the overall performance of very large database tables, it is advisable to:

  • Use an INT or BIGINT data type for the key rather than, say, a GUID
  • Partition the table; partitioning can reduce the size of indexes, effectively reducing the table's working size
  • Add more memory, or move to an SSD drive; faster hardware can work wonders
  • Remove any unnecessary indexes on the table
  • Avoid SELECT COUNT(*) and always include a query filter (see the sketch after this list)
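As a minimal sketch of that last point, the grid below combines enable_pagecount(false) with a filtered SELECT so only a slice of the salaries table is scanned. The WHERE clause on from_date and its cutoff value are illustrative assumptions, not part of the original demo.

// Always include namespace and conf.php on TOP of the script.
use phpCtrl\C_DataGrid;
require_once("/path/to/conf.php");

// Filter the query so the grid never touches the full 3M rows.
// The from_date condition below is an illustrative assumption.
$dg = new C_DataGrid(
    "SELECT * FROM salaries WHERE from_date >= '2000-01-01'",
    "emp_no", "salaries");
$dg->enable_pagecount(false);  // skip the expensive SELECT COUNT(*)
$dg->display();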