
Excessive temporary storage while updating CLOB fields via SQLRPGLE program

Posted: Wed Jan 01, 2025 7:25 pm
by ksharath
Summary of the issue:

A batch job running under the HTTP server on IBM i, which processes database updates to CLOBs using the JSON_UPDATE SQL function, is consuming an excessive amount of temporary storage (close to 8 GB).

Explanation of the process:
We have designed a new database file with a CLOB field meant to store JSON data.
The service that updates the JSON elements in the CLOB field is coded in SQLRPGLE and plugged into the end of an IBM i job running under the IBM i HTTP server, which processes REST API calls from online applications.
On average, each API call involves updating around 10 to 15 fields, each stored as an individual JSON element within the CLOB field.

The way we have implemented this is as follows.

Step 1: Read the JSON data from the CLOB field in the new file and store it in a global variable declared with SQLTYPE(CLOB).

Code:

dcl-s jsonData SQLTYPE(CLOB : 75000);  // precompiles to a DS with jsonData_len and jsonData_data subfields
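
For completeness, the read itself is just a SELECT INTO; the file and column names below are placeholders, not our real object names.

Code:

// Hypothetical names for illustration: MYJSONFILE / JSON_DATA / RECORD_ID
exec sql select JSON_DATA
           into :jsonData
           from MYJSONFILE
          where RECORD_ID = :recordId;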
Step 2: For every field that needs to be updated within a single API call, use the JSON_UPDATE SQL function to manipulate the SQLTYPE(CLOB) variable.

Code:

exec sql
  values( JSON_UPDATE(:jsonData, 'SET', trim(:path), trim(:newValue)) )
    into :jsonData;

Note: We invoke the function above repeatedly, once for each field being updated by the API call, as sketched below.
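
To show the pattern concretely, the calls end up in a loop like this (pathList, valueList and fieldCount are made-up names for this example):

Code:

// Sketch only: pathList(), valueList() and fieldCount are hypothetical
// holders for the JSON paths and new values taken from the API request
for i = 1 to fieldCount;
  path = pathList(i);
  newValue = valueList(i);
  exec sql
    values( JSON_UPDATE(:jsonData, 'SET', trim(:path), trim(:newValue)) )
      into :jsonData;
endfor;

Note that every iteration passes the whole CLOB into JSON_UPDATE and copies the whole result back into :jsonData.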


Step 3: Once all the fields are updated, we transfer the JSON value from the global variable back to the database with an SQL UPDATE.
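
The write-back looks like this (again with the placeholder names from Step 1):

Code:

// Hypothetical names, matching the read in Step 1
exec sql update MYJSONFILE
           set JSON_DATA = :jsonData
         where RECORD_ID = :recordId;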

While implementing this approach, we observed that the JSON_UPDATE function is causing a sudden spike in the job's temporary storage (it jumps from 450 MB to 3 GB, seemingly at random, during one of the updates). We are measuring this temporary storage in the buckets numbered above 65535, which hold job temporary storage.
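
In case anyone wants to reproduce the measurement, we watch the job-scoped buckets through the QSYS2.SYSTMPSTG catalog view, along these lines (the qualified job name is a placeholder):

Code:

-- Buckets numbered above 65535 are the job temporary storage buckets
select BUCKET_NUMBER, BUCKET_CURRENT_SIZE
  from QSYS2.SYSTMPSTG
 where JOB_NAME = '123456/QTMHHTTP/MYJOB'
   and BUCKET_NUMBER > 65535;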

I am failing to understand whether this is caused by the repeated invocation of JSON_UPDATE, whether it could be a memory leak, or whether there is a PTF I should check for.


Any thoughts, please?

Re: Excessive temporary storage while updating CLOB fields via SQLRPGLE program

Posted: Thu Jan 02, 2025 8:01 pm
by Scott Klement
I'm not familiar with this. I never use SQL for JSON.

Have you asked IBM?

Re: Excessive temporary storage while updating CLOB fields via SQLRPGLE program

Posted: Fri Jan 03, 2025 3:49 am
by ksharath
Thanks, Scott.

I am trying to get in touch with IBM in the meantime.

Also, would you have any better ideas for doing a PATCH of JSON elements without using SQL?

I have been looking at the YAJL procedures, and it seems they only provide exports to do a GET from JSON. I am not sure whether we can PATCH JSON elements using the YAJL procedures.

Please suggest.

Re: Excessive temporary storage while updating CLOB fields via SQLRPGLE program

Posted: Fri Jan 03, 2025 4:27 am
by Scott Klement
Yes. In the current version of YAJL there are subprocedures called YAJL_object_insert and YAJL_array_add. These allow you to insert a new key into an object or add a new element to an array, respectively. To tell it to replace the value of an existing element in an object, you can use the relationship flag 'yajl_replace'.

Once you've made the changes, you can output a new file or string and it will contain the updates.
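
Something along these lines (a rough sketch from memory; the exact parameter lists are not guaranteed, so check the YAJL_H copybook for the real prototypes):

Code:

// Rough sketch -- verify the prototypes against YAJL_H before using.

dcl-s tree    pointer;        // yajl_val: root of the parsed tree
dcl-s newVal  pointer;        // yajl_val: node holding the new value
dcl-s errMsg  varchar(500);

// Parse the JSON from the CLOB buffer into a tree
tree = yajl_buf_load_tree( %addr(jsonData_data)
                         : jsonData_len
                         : errMsg );

// ...build newVal with the tree-building procedures in YAJL_H...

// yajl_replace tells YAJL_object_insert to overwrite the existing key
// instead of adding a duplicate ('customerName' is just an example key,
// and the parameter order here is assumed)
YAJL_object_insert( tree : 'customerName' : newVal : yajl_replace );

// Then write the tree back out to a buffer or stream file, do the
// SQL UPDATE from that, and free the tree when done
yajl_tree_free( tree );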

Re: Excessive temporary storage while updating CLOB fields via SQLRPGLE program

Posted: Mon Mar 03, 2025 9:27 pm
by ksharath
Thanks, Scott.

We managed to get some help from IBM, who provided a PTF to resolve this issue.

We will look at implementing the YAJL procedures for JSON handling in future releases, instead of using the SQL functions.