I want to get all post IDs of my product pages, but neither the first way nor the second way is successful…
I always get:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 8388616 bytes) in /var/www/vhosts/httpdocs/wp-includes/wp-db.php on line 1842
First way:
if ( ! defined('ABSPATH') ) {
/** Set up WordPress environment */
require_once( dirname( __FILE__ ) . '/wp-load.php' );
}
$posts_array = get_posts(array(
'post_type' => 'product',
'posts_per_page' => -1
));
$myfile = fopen("wp_all_import.txt", "a");
foreach ( $posts_array as $value ) {
    fwrite( $myfile, $value->ID . "\n" );
}
fclose($myfile);
Second way:
if ( ! defined('ABSPATH') ) {
/** Set up WordPress environment */
require_once( dirname( __FILE__ ) . '/wp-load.php' );
}
$products_IDs = new WP_Query( array(
'post_type' => 'product',
'posts_per_page' => -1
));
$myfile = fopen("wp_all_import.txt", "a");
while ($products_IDs->have_posts() ) : $products_IDs->the_post();
fwrite($myfile, get_the_ID() . "\n");
endwhile; wp_reset_query();
fclose($myfile);
Does anybody know where my mistake is and how I can fix it?
I just want to get all the post IDs of my product posts.
Greetings and thank you!
-
If all you want is to print the IDs to a file, then you can write a custom query for it. That way you'll be able to avoid some of the internal processing WordPress does.
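For instance, a minimal sketch of such a custom query using `$wpdb->get_col` (assuming the post type is `product` and the script runs from the WordPress root, like your examples):

```php
if ( ! defined( 'ABSPATH' ) ) {
    /** Set up WordPress environment */
    require_once dirname( __FILE__ ) . '/wp-load.php';
}

// Fetch only the ID column; this avoids building full WP_Post objects.
$ids = $wpdb->get_col(
    "SELECT ID FROM {$wpdb->posts} WHERE post_type = 'product' AND post_status = 'publish'"
);

file_put_contents( 'wp_all_import.txt', implode( "\n", $ids ) . "\n", FILE_APPEND );
```

Because only a single column of integers comes back from MySQL, the memory footprint stays tiny compared to hydrating thousands of post objects.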
-
Many posts may exhaust your RAM, although I don't think just selecting the IDs of 2100 posts should really eat up 134 MB of RAM. Just do the math: an ID
can be stored in just a few bytes, but let's say it takes 4 bytes. Still, 2100 x 4 = 8400 bytes = 8.4 KB. Obviously PHP needs more internal memory to process them, to create objects, etc. But with 134 MB of memory, I could easily process a few hundred thousand IDs. So obviously you are going wrong somewhere else.
Anyway, for whatever reason (maybe you need to select everything from product, not just the ID), you can segment the query with LIMIT, like in the following code:
if ( ! defined('ABSPATH') ) {
/** Set up WordPress environment */
require_once( dirname( __FILE__ ) . '/wp-load.php' );
}
// $limit determines how many rows you want to handle at any given time
// increase / decrease this limit to see how much your server can handle at a time
$limit = 100;
$start = 0;
// open file handle
$myfile = fopen( dirname( __FILE__ ) . '/wp_all_import.txt', 'a' );
$qry = "SELECT ID FROM `{$wpdb->posts}` WHERE post_type = 'product' AND post_status = 'publish' LIMIT %d, %d";
while( $result = $wpdb->get_results( $wpdb->prepare( $qry, array( $start, $limit ) ) ) ) {
$write_data="";
foreach ( $result as $row ) {
$write_data = $write_data . $row->ID . "\n";
}
// Generally speaking, writing immediately to the file is better than
// string concatenation, because depending on what you concat and how many times,
// the generated string may become too big (like an MB-sized string).
// On the other hand, writing to the file thousands of times in a single script may
// cause I/O delays. So, here I've done a combination of the two to keep both
// the string size and the I/O within limits.
// Adjust this according to your own situation.
fwrite( $myfile, $write_data );
$start = $start + $limit;
}
// close file handle
fclose( $myfile );
This way PHP will only handle at most $limit
rows at a time, so the memory limit should not be exceeded.
Note: never concatenate to build very long strings (like MB-long ones); write to the file before the string becomes too long. That may produce some I/O delay, but it won't exhaust the memory limit.
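As an aside, if you'd rather stay with WP_Query, it also supports a 'fields' => 'ids' argument that makes it return plain IDs instead of full post objects, which cuts memory use dramatically (a sketch, assuming the product post type from the question):

```php
$query = new WP_Query( array(
    'post_type'      => 'product',
    'post_status'    => 'publish',
    'posts_per_page' => -1,
    'fields'         => 'ids', // return an array of IDs, not WP_Post objects
) );

// $query->posts is now an array of integers.
file_put_contents( 'wp_all_import.txt', implode( "\n", $query->posts ) . "\n", FILE_APPEND );
```

This keeps the single-query convenience of your second attempt while avoiding the per-post object overhead that is most likely what exhausted your memory.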