Display data in batches of 100 every hour in MySQL/PHP


I have a database table with more than 600 rows, but I can only retrieve/display 100 of them every hour. So I use

select * from table ORDER BY id DESC LIMIT 100

to retrieve the first 100. How do I write a script that retrieves the data in batches of 100 every hour, so that I can use it in a cron job?

Possible solution.

  1. Add a field to mark whether a record was already shown.
ALTER TABLE tablename
ADD COLUMN shown TINYINT NULL DEFAULT NULL;

NULL means the record has not been selected yet, 1 means the record is marked for selection, and 0 means the record was already selected.

  2. When you need to select up to 100 records:

2.1. Mark records to be shown

UPDATE tablename
SET shown = 1
WHERE shown = 1 
   OR shown IS NULL
ORDER BY shown = 1 DESC, id ASC
LIMIT 100;

The shown = 1 condition in WHERE accounts for records that were marked on a previous run but not selected due to some error. ORDER BY shown = 1 DESC re-marks such records before non-marked ones.

If there are 100 or fewer records that were not yet selected, all of them will be marked; otherwise only the 100 records with the lowest id (the oldest ones) will be marked.

2.2. Select marked records.

SELECT *
FROM tablename
WHERE shown = 1
ORDER BY id
LIMIT 100;

2.3. Mark selected records.

UPDATE tablename
SET shown = 0
WHERE shown = 1
ORDER BY id
LIMIT 100;

This is applicable when only one client selects the records.

If many clients may work in parallel, and each record must be selected by only one client, then mark a record for selection with a client number (unique across all clients) instead of 1.
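A minimal sketch of the per-client variant. The client number 7 here is a made-up example; in practice each worker would pass in its own unique number, and it must fit the shown column's TINYINT range:

```sql
-- 2.1. Claim up to 100 unclaimed records for client 7 (re-claiming any rows
-- this client marked earlier but failed to select).
UPDATE tablename
SET shown = 7
WHERE shown = 7
   OR shown IS NULL
ORDER BY shown = 7 DESC, id ASC
LIMIT 100;

-- 2.2. Select only the records claimed by this client; other clients' claims
-- are invisible here, so no record is served twice.
SELECT *
FROM tablename
WHERE shown = 7
ORDER BY id;

-- 2.3. Mark the claimed records as already selected.
UPDATE tablename
SET shown = 0
WHERE shown = 7;
```

Because each client filters on its own number, two clients running the claim step concurrently can never select the same row.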

Of course, if there is only one client, and you can guarantee that selection will not fail, you may simply store the last shown id somewhere (on the client side, or in a service table on the MySQL side) and select the "next 100" starting from this stored id:

SELECT *
FROM tablename
WHERE id > @stored_id
ORDER BY id
LIMIT 100;

and

SELECT MAX(id)
FROM (SELECT id
      FROM tablename
      WHERE id > @stored_id
      ORDER BY id
      LIMIT 100) AS batch;

to store as the new @stored_id. Note that the ORDER BY and LIMIT must go into a subquery: applied directly to an aggregate SELECT, they would not restrict MAX(), which would be computed over all rows matching the WHERE clause rather than over the 100 rows actually fetched.
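The marking steps above can be wrapped in a stored procedure, so an hourly cron job only has to CALL it. This is a sketch; the procedure name fetch_batch is an assumption, and tablename is the placeholder used throughout:

```sql
DELIMITER //

CREATE PROCEDURE fetch_batch()
BEGIN
  -- 2.1. Mark up to 100 records: previously marked-but-unselected ones first,
  -- then the oldest not-yet-shown ones.
  UPDATE tablename
  SET shown = 1
  WHERE shown = 1
     OR shown IS NULL
  ORDER BY shown = 1 DESC, id ASC
  LIMIT 100;

  -- 2.2. Return the marked records to the caller as the procedure's result set.
  SELECT *
  FROM tablename
  WHERE shown = 1
  ORDER BY id;

  -- 2.3. Mark exactly those records as already selected.
  UPDATE tablename
  SET shown = 0
  WHERE shown = 1;
END //

DELIMITER ;
```

The cron job then reduces to a single statement, `CALL fetch_batch();`, executed from PHP or the mysql client.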


Thank you @Akina and @Vivek_23 for your contributions. I was able to figure out an easier way to go about it.

  1. Add a new field to the table, e.g. shownstatus.
  2. Create a cron job that, every hour, displays 100 records (LIMIT 100) whose shownstatus is not marked as shown, and then updates each displayed record's shownstatus to shown. NB: if the cron job runs every hour for a whole day, all records will have been displayed and marked as shown by close of day.
  3. Create a second cron job to reset every record's shownstatus to notshown.

The downside to this is that you can only display a total of 2,400 records a day, i.e. 100 records every hour times 24 hours. So if your table grows to about 10,000 records, you will need to let the cron job run for at least 5 days to display them all.

Still open to a better approach if there is one, but until then I will stick with this.
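The hourly and daily jobs described above boil down to a few statements. This sketch assumes shownstatus is a nullable string column on a hypothetical table tablename, with NULL meaning "notshown":

```sql
-- Hourly job, step 1: fetch the next 100 not-yet-shown records.
SELECT *
FROM tablename
WHERE shownstatus IS NULL
ORDER BY id
LIMIT 100;

-- Hourly job, step 2: mark the same 100 records as shown
-- (same filter, order, and limit, so the same rows are affected).
UPDATE tablename
SET shownstatus = 'shown'
WHERE shownstatus IS NULL
ORDER BY id
LIMIT 100;

-- Daily reset job: make every record eligible again.
UPDATE tablename
SET shownstatus = NULL;
```

If new rows can be inserted between the two hourly statements, the SELECT and UPDATE should run inside one transaction (or use the marking-first approach from the earlier answer) so they touch the same rows.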


Let's say you made a cron that hits a URL something like

http://yourdomain.com/fetch-rows

or a script for instance, like

your_project_folder/fetch-rows.php

Let's say you have a DB table in place that looks something like this:

| id | offset | created_at          |
|----|--------|---------------------|
| 1  | 100    | 2019-01-08 03:15:00 |
| 2  | 200    | 2019-01-08 04:15:00 |
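The cron_hit_table itself is not defined in the answer; a plausible definition, assuming the auto-increment id and CURRENT_TIMESTAMP default that the script below relies on, would be:

```sql
CREATE TABLE cron_hit_table (
    id         INT UNSIGNED NOT NULL AUTO_INCREMENT,  -- lets the script find the latest row via MAX(id)
    `offset`   INT UNSIGNED NOT NULL,                 -- next offset to use in the LIMIT clause
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (id)
);
```

The backticks around `offset` guard against it clashing with the OFFSET keyword in some MySQL/MariaDB versions.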

Your script:

<?php

define('FETCH_LIMIT',100);

$conn = mysqli_connect(....); // connect to DB

$result = mysqli_query($conn,"select * from cron_hit_table where id = (select max(id) from cron_hit_table)"); // select the last record to get the latest offset

$offset = 0; // initial default offset

if(mysqli_num_rows($result) > 0){
    $offset = intval(mysqli_fetch_assoc($result)['offset']);
}


// Now, hit your query with $offset included

$result = mysqli_query($conn,"select * from table ORDER BY id DESC LIMIT $offset,".FETCH_LIMIT);

while($row = mysqli_fetch_assoc($result)){
    // your data processing
}

// insert new row to store next offset for next cron hit

$offset += FETCH_LIMIT; // increment current offset

mysqli_query($conn,"insert into cron_hit_table(offset) values($offset)"); // because ID would be auto increment and created_at would have default value as current_timestamp


mysqli_close($conn);

Whenever the cron hits, you fetch the last row from your hit table to get the offset, run the query with that offset, and store the next offset in your table for the next hit.

Update:

As pointed out by @Dharman in the comments, you can use PDO for a more abstract way of dealing with different types of databases (but make sure you have the appropriate driver for it; see the list of drivers PDO supports), along with its support for prepared statements.


Comments
  • You will have to store the next offset somewhere.
  • Do you need each record to be selected only once, i.e. a previously selected record must not be returned even if it matches the "last 100" condition? If so, I'd recommend adding a field "shown" to the structure and marking shown records in it, because the number of records added since the last select may exceed 100...
  • @JaySmoke If you are able to store the offset somewhere, DB row growth won't make any difference, since you only fetch 100 at a time. The offset can be stored as JSON in a simple file or in a database (recommended) in a single table.
  • @vivek_23 no, the db will be populated continually, so it will always grow. The retrieval is just for display purposes. There won't be any deletion after the retrieval.
  • Are you in fact wishing to retrieve the most recently added 100 rows every hour? (which is a different question to the one asked)
  • I like the direction, thank you very much, but this doesn't show how the process will be automated every hour until all records are displayed?
  • @JaySmoke Create a stored procedure which performs all the described steps and selects the marked records. Call this procedure from the client side in any convenient way.
  • Ok, so I am thinking: all records are marked 1. I use a cron job to do 2.2, select 100 every hour and mark them as 2, till it's all done. Once done, the cron job will return nothing because all records are now marked 2. Then I will create a second cron job to reset all records (2.3) back to 1? Does this make sense?
  • @JaySmoke No. Create a stored procedure which performs all the described steps and selects the marked records.
  • You wrote Create a second cronjob to update all records' shownstatus to notshown so you can repeat the process over and over. Why would you do that? Done this way, you always just show the same batches of records again.