Hi,
Let's say I have 3 recurring jobs that each run every 1 minute, which means that every minute, 3 records are created in the batch job form.
My first question is:
If one of the recurring jobs exports more than 2000 records, is it still going to create only 1 record in the batch job form? I mean, is there a limit on how many lines the entity can export in 1 batch job?
Now, we said we have 3 recurring jobs, and all of them are assigned to the empty batch group. Let's also say I have another button where, each time it's clicked, an entity gets exported in batch, which means a record gets created in the batch job form as well. The batch group here is also empty.
My second question is:
If 3 recurring batch job records get created at the same time, in addition to another 10 records from the button, that is 13 records in total created in the batch job form at the same time with the same empty batch group. Is there a limit on how many jobs can be executed for the empty batch group at the same time? For example, could only 10 jobs run together, so the other 3 need to wait until one of the 10 jobs finishes before one of the 3 can start executing?
Hi DelDyn,
If you have completely different behavior in production compared to UAT with the same code, settings, and load, you can create a ticket with Microsoft Support. They would be able to check what is going on.
Hi Andre,
Reliable async runs on a batch server. However, you can see the record in the batch job history form, not the batch job form.
As I said earlier, this doesn't happen in my dev box or UAT, only in production, so I can't use a trace. That's why I'm asking whether batch jobs are the reason for the delay, and how I can detect that (you can check my other reply in the last two comments for extra details about the issue).
Hi Deldyn,
Reliable async is usually called without running via a batch job. You started the question about batch jobs and now you are talking about reliable async, so I'm confused now.
I don't have information about possible service limits for the runAsync options in the application. You can create a trace log and analyze the behavior in the Trace parser to get insights into what exactly is happening.
Hi Andre,
I'm not sure what you mean by "batch framework or SysOperationFramework directly"?
I mentioned in a previous comment that I'm using the SysOperationFramework: in the posting method of the journal, I call a controller class in which I define a service class with the execution mode set to reliable async.
There is no dialog; it gets called in the posting method with reliable async so that the posting of the journal doesn't wait for the service logic.
Does this answer your question?
Hi Deldyn,
I wrote a blog about the execution parameters in the past: Speed up data import with (data) entity execution parameters (dynamicspedia.com)
The workflow processing job was an example to show that more than one thread can be used for a batch job. Also, when you use the execution parameters and split the import into bundles, it will use multiple tasks.
For your issue, can you tell me if you are using the batch framework or the SysOperationFramework directly?
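The bundling idea above can be illustrated with a short Python sketch. This is an illustration only, with hypothetical names and a hypothetical bundle size; in the application, the actual splitting is driven by the entity execution parameters, not by custom code like this:

```python
# Illustrative sketch: splitting an import into bundles, where each
# bundle would become one batch task (and each task consumes one thread).

def make_bundles(record_ids, bundle_size):
    """Split a list of record ids into fixed-size bundles."""
    return [record_ids[i:i + bundle_size]
            for i in range(0, len(record_ids), bundle_size)]

# 20,000 records split into bundles of 5,000:
bundles = make_bundles(list(range(20_000)), 5_000)
print(len(bundles))  # 4 bundles -> 4 batch tasks instead of 1 thread doing all the work
```

The point of the split is purely parallelism: without it, the whole import runs in a single batch thread.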
Hi Andre,
I have three questions please:
So even for import there is no limit unless we specify to split the work across more than one task, right? And where can I find this execution parameter?
Also, regarding the workflow job that you said has 100 tasks: is this found in every production environment? How many times does it run? Is it recurring?
Now, the reason I'm asking all these questions is to try to understand an issue I'm facing.
Currently, when a button is clicked to post, let's say, a journal, I have a SysOperationFramework service with the reliable async execution mode that exports a composite entity, inserts into a custom table that works as change tracking, and sends the exported file to Azure Blob storage (in addition to a bit of other logic).
This is done so that files reach the customer in real time, and it works quickly in my dev box, taking between 10 and 15 seconds.
Now in production, we are having a performance issue where people say it takes more than one minute for the file to reach the customer.
I've looked at a few examples, and indeed I can see that the difference between the modifiedDateTime of the journal posting and the createdDateTime of the insert into my custom table (which is done inside the batch after the posting) is sometimes more than 2 minutes! And there are cases where it's quick.
That made me wonder whether a batch job limit has something to do with it. Any idea? And what should I do in this case?
Please note that there is also a recurring job that runs every minute for the same composite entity.
So what happens, in general, is: when the journal gets posted, the SysOperationFramework service inserts a record into the custom table and, based on some logic, detects whether this journal should be sent in real time or in batch. If real time, it exports the entity now and sends it to Azure Blob storage. If not, it waits for the recurring job to pick it up (the recurring job runs every 1 minute).
(There is a filter on the recurring data project and the real-time data project based on status, so that one exports real-time records and the other exports batch records.)
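The routing described above can be sketched roughly as follows. This is a minimal Python illustration with hypothetical names (`TrackingRecord`, `route_journal`); the real implementation is an X++ SysOperationFramework service, not this code:

```python
# Sketch of the real-time vs. batch routing described above
# (hypothetical names; illustration only).

from dataclasses import dataclass

@dataclass
class TrackingRecord:
    journal_id: str
    status: str  # "RealTime" or "Batch"

def export_and_send(record: TrackingRecord) -> None:
    """Stand-in for exporting the composite entity and pushing to Azure Blob."""
    print(f"exporting {record.journal_id} in real time")

def route_journal(journal_id: str, is_real_time: bool) -> TrackingRecord:
    """Insert a change-tracking record and decide how it gets exported."""
    record = TrackingRecord(journal_id, "RealTime" if is_real_time else "Batch")
    if is_real_time:
        # Real-time path: export now and send to Azure Blob storage.
        export_and_send(record)
    # Batch path: do nothing here; the recurring job (running every minute)
    # filters on status == "Batch" and picks the record up later.
    return record
```

The status field is what the two data projects filter on, so the real-time project and the recurring project never export each other's records.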
Hi Deldyn,
Your understanding of the batch queue is correct.
About the first question: data management export takes the 20,000 records in one batch thread. Only when importing data is there an option with execution parameters to split the work.
Hi Andre,
So if the total is 144 threads and, let's say, I have one batch job with 44 tasks in addition to 200 batch jobs with 1 task each,
then the one batch job with 44 tasks will run together with 100 of those jobs at the same time, right?
Now the other 100 are going to wait until one of the 100 running jobs, or the 1 job with 44 tasks, finishes, so that one of the waiting 100 can be added.
Is that right? Is this what you meant?
Regarding my first question, I'm talking about a recurring job which is standard. The question was: if one of the recurring jobs in the queue is exporting more than 20,000 records, this will still execute only 1 batch job for the first recurrence, right? I mean, the number of exported lines in each batch is not limited and there is no maximum?
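The thread arithmetic in this scenario can be checked with a short sketch. This assumes a simple greedy scheduler that fills free threads one job at a time, which is an illustration of the counting, not the exact batch scheduler behavior:

```python
# Sketch of the batch thread arithmetic discussed above:
# 144 threads total, one job with 44 tasks plus 200 single-task jobs.

TOTAL_THREADS = 144  # e.g. 12 batch servers * 12 threads each

def running_and_waiting(task_counts):
    """Greedy fill: each task consumes one thread until none are left."""
    free = TOTAL_THREADS
    running, waiting = [], []
    for tasks in task_counts:
        if tasks <= free:
            running.append(tasks)
            free -= tasks
        else:
            waiting.append(tasks)
    return running, waiting

jobs = [44] + [1] * 200          # one 44-task job, then 200 one-task jobs
running, waiting = running_and_waiting(jobs)
print(len(running), len(waiting))  # 101 jobs running (44 + 100 threads), 100 waiting
```

So the 44-task job plus 100 single-task jobs saturate all 144 threads, and the remaining 100 jobs wait for a thread to free up, matching the description above.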
Hi Deldyn,
12 * 12 would be 144 threads. Note that some batch jobs, like the workflow message processing job, can create multiple tasks; these are bundled per 100 workflow messages. Each task consumes a thread. This is all regardless of the batch group. I hope this answers your second question.
Can you elaborate on your first question? Are you using the standard Data Import/Export features or a custom export job?