Stream-Style Processing of a MySQL Result Set in Node

A regular select query with the mysql module in Node.js calls the provided callback with all of the selected rows at once. If you select 10 rows from the database, that is fine; if you select one million, it is not. Here is how to process large MySQL result sets in a stream-style fashion.

Normally, processing millions of rows is a job for a command-line application rather than a web service (express). But command-line application or not, being able to process a stream of records instead of one bulk result is a big improvement.

Here is the code:

    var stream = require('stream');
    var mysql = require('mysql');

    // Bail out on any unhandled promise rejection.
    process.on('unhandledRejection', (err) => {
      console.error('an error was thrown: ' + err);
      process.exit(1);
    });

    var conn = mysql.createConnection({
      host: 'localhost',
      user: 'user_here',
      password: 'password_here',
      database: 'dbname'
    });

    var i = 0;

    conn.query('select id, name, size from alfa order by id')
      .stream({ highWaterMark: 5 })      // fetch at most 5 rows ahead
      .pipe(new stream.Transform({
        objectMode: true,                // rows arrive as objects, not byte chunks
        transform: function (row, encoding, callback) {
          i++;                           // process one row at a time
          console.log(i);
          callback();                    // signal readiness for the next row
        }
      }))
      .on('finish', function () {
        console.log('finished');
        conn.end();
      });

In the example above, highWaterMark configures how many rows the fetch buffer holds before backpressure is applied to the database stream.
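The same pipe-into-a-Transform pattern can be tried without a database. Here is a minimal, self-contained sketch (Node core streams only, no mysql; the fake rows are made up for illustration) that replaces the query stream with an object-mode Readable:

    var stream = require('stream');

    // Stand-in for the query stream: an object-mode Readable of fake rows.
    // (stream.Readable.from is available in Node 12+.)
    var rows = stream.Readable.from([
      { id: 1, name: 'a', size: 10 },
      { id: 2, name: 'b', size: 20 },
      { id: 3, name: 'c', size: 30 }
    ]);

    var count = 0;

    rows.pipe(new stream.Transform({
      objectMode: true,          // rows are objects, not byte chunks
      highWaterMark: 5,          // buffer at most 5 rows before backpressure kicks in
      transform: function (row, encoding, callback) {
        count++;                 // process one row at a time
        callback();              // signal readiness for the next row
      }
    })).on('finish', function () {
      console.log('processed ' + count + ' rows');
    });

Because the transform only asks for the next row via callback(), slow per-row work automatically slows the producer down instead of buffering everything in memory.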