get_results on large datasets

I am experiencing a very disturbing problem with a custom SQL query.
I have a custom MySQL table called trading_datas with 165 rows totaling about 4.4 MB. One field of my table, datas, is a big serialized array.

When requesting over it that way:

$querystr = "SELECT item, datas FROM trading_datas";
$result = $wpdb->get_results( $querystr, 'ARRAY_A' );

my PHP script stops and nothing gets echoed after that line.
When I retrieve only item, everything works fine.

Should I compress my serialized array before putting it in the database? Is there any other solution?

Thanks a lot for your help!


$wpdb is not suited to fetching huge amounts of data from the database. Why? In your case:

  1. $wpdb->get_results( ... ) – fetches all results into RAM at once. If you have 4 MB, 10 MB, or 50 MB of data in the database, all of it will be stored in memory (which, as you know, is limited).
  2. $wpdb->get_results( ..., ARRAY_A ) – $wpdb fetches everything as objects by default, and if you pass ARRAY_A it then converts that array of objects into an array of associative arrays. So once you have fetched 4 MB/10 MB/50 MB of data into memory, that usage can double or worse.
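
You can check this on your own install. The following diagnostic sketch (table and column names taken from the question; everything else is illustrative) prints the configured memory_limit and measures how much peak memory the buffered fetch adds:

```php
<?php
// Diagnostic sketch: show the PHP memory ceiling, then measure how much
// peak memory a buffered $wpdb->get_results() call adds. Run inside
// WordPress so $wpdb is available.
echo 'memory_limit: ' . ini_get( 'memory_limit' ) . "\n";

$before = memory_get_peak_usage();
$rows   = $wpdb->get_results( "SELECT item, datas FROM trading_datas", ARRAY_A );
$after  = memory_get_peak_usage();

printf( "Rows fetched: %d\n", count( $rows ) );
printf( "Peak memory grew by %.1f MB\n", ( $after - $before ) / ( 1024 * 1024 ) );
```

If the growth approaches the limit, the blank output you are seeing is almost certainly a fatal "Allowed memory size exhausted" error (check your PHP error log).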

So, I suppose your script is hitting the PHP memory limit. That's why it works when you select only the small item column and fails for the full rows. One workaround is to use the mysql_* functions directly, processing one row at a time:

$result = @mysql_query( "SELECT item, datas FROM trading_datas", $wpdb->dbh );
if ( $result ) {
    // Fetch and process one row at a time instead of building
    // one giant array of all rows in PHP memory.
    while ( ( $row = @mysql_fetch_array( $result, MYSQL_ASSOC ) ) ) {
        // do stuff here with $row ...
    }
    // Release the result set when done.
    @mysql_free_result( $result );
}
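
Note that the mysql_* extension is deprecated as of PHP 5.5 and removed in PHP 7. On a current WordPress install (which uses the mysqli driver by default, so $wpdb->dbh is a mysqli connection — verify this on your setup), the same row-at-a-time approach can be sketched with an unbuffered query. MYSQLI_USE_RESULT streams rows from the server instead of buffering the whole result set:

```php
<?php
// Sketch, assuming $wpdb->dbh is a mysqli connection handle.
// MYSQLI_USE_RESULT makes the query unbuffered: rows are streamed
// from the server one at a time rather than loaded all at once.
$result = mysqli_query(
    $wpdb->dbh,
    "SELECT item, datas FROM trading_datas",
    MYSQLI_USE_RESULT
);
if ( $result ) {
    while ( $row = mysqli_fetch_assoc( $result ) ) {
        // Only the current row is held in memory here.
        $data = maybe_unserialize( $row['datas'] );
        // do stuff with $row['item'] and $data ...
    }
    mysqli_free_result( $result );
}
```

One caveat with unbuffered results: you must fetch every row and free the result before issuing another query on the same connection.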
