Hello all,
I am trying to get JSON data from an API in my D365 FO project. In one request, I am getting more than 100 MB of data. As I need to store the serialized JSON data in a string variable, I got to know that there is a limit on how much data a string variable can hold. I read the blog http://dev.goshoom.net/2012/05/insufficient-memory-en/ but I couldn't work out how to apply it to D365 Finance and Operations, as it's not a rich client application.
Please suggest how I can modify the configuration (on-premises and cloud server) so that a string variable can hold at least 100 or 150 MB of data.
Thank you in advance.
It seems that your application will need a lot of memory...
What if you try System.String or System.IO.MemoryStream instead of the X++ string type? Of course, the best choice depends on what you want to do with the string.
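As a minimal sketch (assuming 'response' is an existing System.Net.HttpWebResponse), copying the payload into a MemoryStream might look like this:

// A sketch only; 'response' is assumed to be an existing System.Net.HttpWebResponse.
// The payload ends up as bytes in managed memory rather than in an X++ string.
System.IO.MemoryStream memStream = new System.IO.MemoryStream();

using (var responseStream = response.GetResponseStream())
{
    responseStream.CopyTo(memStream);
}

memStream.set_Position(0); // rewind before reading; X++ uses get_/set_ methods for CLR properties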
Thanks Martin for the response.
I did try System.String to store the data, but it looks like it comes with memory limitations too.
As the string contains serialized JSON data, I want to deserialize it and then process the data. The only issue is that I am not able to store such a large amount of serialized data.
Are there any configuration settings I can change to define the memory limits of variables?
What if you run the same code in C#? Will you hit the same error?
At which line of code do you get the error (when using System.String)?
Have you already tried MemoryStream?
Notice that Data Management, for example, is able to process much larger files. But it doesn't try to put everything into a single string.
No, I don't think you can configure it. There might be something you could do in development environments, but you wouldn't be able to do it in production.
Here is my code:

streamReader = new System.IO.StreamReader(response.GetResponseStream()); // reading from the HTTP response
resJson = streamReader.ReadToEnd(); // here I get the error, when assigning the stream content to a System.String/str variable
As the data is in JSON format, even if I try to read it in batches of, let's say, 1000 characters, it would be hard to read complete objects and then parse them.
I have not tried MemoryStream yet; again, the question is how I would read the complete data in chunks.
In scenarios where I do not have control over how much data the resource sends me, how would you have handled the situation?
Your code is using a stream, even if you don't realize it. But why do you want to fetch the whole stream content into a string? How are you processing it? Ideally this wouldn't be needed at all, but we can't offer any better solution until you explain why you're doing what you're doing.
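Just to illustrate the direction, here is a rough sketch of reading the JSON directly from the stream with Newtonsoft.Json (which ships with D365 FO), so the whole payload is never held in a single string. It assumes the payload is a large JSON array and processes one object at a time; whether JObject is the right tool depends on your data:

using (var streamReader = new System.IO.StreamReader(response.GetResponseStream()))
{
    using (var jsonReader = new Newtonsoft.Json.JsonTextReader(streamReader))
    {
        while (jsonReader.Read())
        {
            if (jsonReader.get_TokenType() == Newtonsoft.Json.JsonToken::StartObject)
            {
                // Load only the current object into memory
                Newtonsoft.Json.Linq.JObject item = Newtonsoft.Json.Linq.JObject::Load(jsonReader);
                // process one item here; memory use stays bounded by one item
            }
        }
    }
}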
Okay Martin, let me tell you what I am trying to do.
I have an interface which sends me items (JSON data) based on certain inputs; then I create those items in my D365FO.
So I make an HTTP request to the interface and read the data it sends via a stream. After this I read the stream and assign it to a string, which I then deserialize using contract classes.
This enables me to read the JSON data using contract classes and then process it.
There is no issue in this process as long as the JSON data I receive is less than 50 MB. But if it exceeds 50 MB, the system throws an OutOfMemory exception.
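In outline, the flow is something like this (ItemListContract is a placeholder for my real data contract class, and the URL is a placeholder too):

System.Net.WebRequest request = System.Net.WebRequest::Create('https://example.com/api/items');
System.Net.WebResponse response = request.GetResponse();
str resJson;

using (var streamReader = new System.IO.StreamReader(response.GetResponseStream()))
{
    resJson = streamReader.ReadToEnd(); // OutOfMemory exception here once the payload grows past ~50 MB
}

// deserialize the JSON into the data contract and then process the items
ItemListContract contract = FormJsonSerializer::deserializeObject(classNum(ItemListContract), resJson);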
I'm not sure if the error is coming from the streamReader or the string. A string has an upper limit, but if I'm not wrong it should be far more than 50 MB (about 2 GB, I guess).
If the error is coming from the string, you can try a memo type variable.
If it's coming from the StreamReader, you can try to get the values via the streamReader.ReadLine() method.
Other than that, the streamReader can read in chunks of a certain size if I remember correctly (streamReader.Read should have such a feature). You can try to take the parts and join them into a string, as in the sketch below.
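For example (the 4096-character buffer size is an arbitrary choice):

System.Char[] buffer = new System.Char[4096]();
System.Text.StringBuilder builder = new System.Text.StringBuilder();
int charsRead = streamReader.Read(buffer, 0, 4096);

while (charsRead > 0)
{
    builder.Append(new System.String(buffer, 0, charsRead));
    charsRead = streamReader.Read(buffer, 0, 4096);
}
// Note: joining everything into one string still materializes the whole payload,
// so this only helps if each chunk can be processed and discarded instead.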
Thank you Ergun for your response.
The error is coming from the string, as it has a memory limit.
Could you please tell me about the memo type variable? I couldn't find anything about its usage in D365 FO.
If the API can return any amount of data, which you put into a string in memory, it sounds like a flaw in the architecture to me. It might run into various limits sooner or later and it will require a huge amount of memory. For example, if the data has 150 MB, you'll have it at least twice (in the request and in your string), which means 300 MB of RAM before you've even started doing anything. Parsing the string will require additional resources, and what if there are multiple calls like this, multiplying the amount of RAM needed?
A common solution to this problem is pagination.
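As a sketch, assuming the API offered hypothetical 'page' and 'pageSize' query parameters and returned an empty array when there is no more data, the client side could look like this:

int page = 1;
boolean morePages = true;

while (morePages)
{
    str url = strFmt('https://example.com/api/items?page=%1&pageSize=500', page); // placeholder URL and parameters
    System.Net.WebRequest request = System.Net.WebRequest::Create(url);

    using (var reader = new System.IO.StreamReader(request.GetResponse().GetResponseStream()))
    {
        str pageJson = reader.ReadToEnd(); // each page stays small, so a string is fine here
        morePages = pageJson != '[]';
        // deserialize and process just this page before fetching the next one
    }

    page++;
}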
Nevertheless, I've just assigned 150 MB to a string without any error. I tested it by uploading a large text file, using the following code:
class MyClass
{
    public static void main(Args _args)
    {
        FileUploadResultBase result = File::GetFileFromUser();

        if (result.getUploadStatus())
        {
            using (var reader = new System.IO.StreamReader(result.openResult()))
            {
                str content = reader.ReadToEnd();
                info('success');
            }
        }
    }
}