That you have 100000 records doesn't mean you must, or should, make 100000 separate requests. The overhead would slow everything down, and it could cause such severe performance problems that Microsoft actively protects the ERP system against an excessive number of requests.
You can, for example, create a file with all the records and push it to F&O through a single request to a data management API. Or you could create smaller packages, e.g. of a few thousand records each, and send them to a data management API, to a custom service, or so. You won't have a problem with the number of incoming requests if there aren't any, e.g. if you have a process in F&O that gets the data from a message queue in Azure. Message queues are also good for load distribution: if there are many requests in a short period, you simply get a longer queue instead of making too many requests to F&O and running into the protection limits.
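To illustrate the batching idea, here is a minimal sketch of splitting a large record set into packages of a few thousand records each; the function name and batch size are my own choices, not anything prescribed by F&O:

```python
def chunk_records(records, size=5000):
    """Yield consecutive batches of at most `size` records.

    Each batch would then become one package/request sent to a data
    management API or a custom service, instead of one request per record.
    """
    for i in range(0, len(records), size):
        yield records[i:i + size]


# 100000 records in batches of 5000 -> only 20 requests instead of 100000.
batches = list(chunk_records(list(range(100_000)), size=5000))
```

The right batch size is a trade-off you'd have to test: larger batches mean fewer requests, but each one takes longer to process and retry on failure.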
Note that the problem isn't in Power Automate and Logic Apps; it's in the logic you implement there.
RIS (Recurring Integrations Scheduler) is a client application for calling data management APIs from a Windows machine, based on files in a folder. You definitely should consider data management APIs (as the very first thing, I'd say), but you can call the API by any other means too, e.g. from an Azure Function, from Power Automate, directly from another application, and so on.
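As a rough sketch of what such a call looks like, these helpers build the request URL and body for the documented `ImportFromPackage` action of the data management package API (the host name, definition group, and legal entity below are placeholders; authentication via Azure AD is omitted):

```python
import json


def dmf_action_url(resource: str, action: str) -> str:
    """Build the OData action URL for the data management package API.

    Package actions such as GetAzureWriteUrl and ImportFromPackage are
    exposed on the DataManagementDefinitionGroups entity.
    """
    return (f"{resource}/data/DataManagementDefinitionGroups/"
            f"Microsoft.Dynamics.DataEntities.{action}")


def import_request_body(package_url: str, definition_group: str,
                        legal_entity: str) -> str:
    """JSON body for ImportFromPackage: run the import right away."""
    return json.dumps({
        "packageUrl": package_url,        # blob URL of the uploaded package
        "definitionGroupId": definition_group,
        "executionId": "",                # empty -> F&O generates one
        "execute": True,
        "overwrite": True,
        "legalEntityId": legal_entity,
    })
```

In a real integration you would first call `GetAzureWriteUrl` to get a writable blob URL, upload the package file there, and then POST this body to the `ImportFromPackage` URL; any HTTP client (Azure Function, Power Automate, another application) can do that.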
I don't know why you think that using a custom service is the most efficient way; ultimately it's about the business logic you implement there. You can implement efficient bulk imports inside a custom service, but you can also use custom services in a very inefficient way. And this applies to the other options too.