Hey there Ivan:
So a few things.
1- To look at the size of each row, you can look at the data type of each column to get an idea of the size of the data you're moving. There's a sample procedure you can use to count rows, and together those will give you an idea of the table size. Regarding compression: if the rows are compressed, they'll need to be uncompressed to be ingested, but unless you've enabled compression, I don't think that's an issue.
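As a rough back-of-the-envelope version of that idea, here's a minimal Python sketch that estimates table size from column data types and a row count. The byte sizes are typical SQL Server storage sizes and the column list is made up for illustration; your actual schema and row counts will differ, so treat the result as a ballpark only.

```python
# Rough table-size estimate from column data types.
# Byte sizes below are typical SQL Server storage sizes (assumption --
# adjust for your actual schema; variable-length columns use their max).
TYPE_SIZES = {
    "int": 4,
    "bigint": 8,
    "datetime": 8,
    "bit": 1,
    "float": 8,
}

def estimate_table_size(columns, row_count):
    """columns: list of (name, type, length) tuples; length matters
    only for character types. Returns an estimated size in bytes."""
    row_bytes = 0
    for name, col_type, length in columns:
        if col_type in ("varchar", "char"):
            row_bytes += length          # 1 byte per character
        elif col_type in ("nvarchar", "nchar"):
            row_bytes += length * 2      # 2 bytes per character (UTF-16)
        else:
            row_bytes += TYPE_SIZES.get(col_type, 8)  # default guess
    return row_bytes * row_count

# Hypothetical contacts table with 2 million rows:
cols = [("ContactId", "int", None),
        ("Email", "nvarchar", 100),
        ("CreatedOn", "datetime", None)]
total = estimate_table_size(cols, 2_000_000)
print(f"~{total / 1024**3:.2f} GB")
```

For real numbers you'd pull the column types and row counts from the database itself (e.g. `sp_spaceused` on SQL Server), but this gives you the shape of the calculation.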
2- Understood, so performance will be an issue, but transfer time is generally a lot less than the time spent conflating your contacts and then unifying the associated activities. I can't say it's always true, but if you have a lot of activities, chances are that activity unification in the refresh process will be the time killer. Ingestion is usually pretty fast (a few minutes).
3- OK, so you can use CSV, Excel, or Parquet. On data ingestion, the middle option is to mount a CDM folder instead of using the Power Query interface, but that requires a little more work. So if you just did a full dump of your table to CSV/Parquet/JSON (or whichever file format you pick) and wrote it to either Azure Blob Storage or a data lake, you can connect to it just like a database. Other than setting up the connection initially, you can't really tell the difference. So in the case of your screenshot: pick a file format, say Parquet or CSV, then specify the container it's in and the access key. It'll authenticate, and after that everything is exactly the same.
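The "full dump to a file" step is really just serializing your rows. Here's a minimal self-contained sketch using CSV (the rows here are invented example data); the upload to Azure Blob Storage would normally go through the `azure-storage-blob` package's `BlobServiceClient`, which I've left as a comment so the sketch runs on its own.

```python
import csv
import io

# Invented example rows standing in for a real table dump.
rows = [
    {"ContactId": 1, "Email": "a@example.com"},
    {"ContactId": 2, "Email": "b@example.com"},
]

# Serialize to CSV in memory; in practice you'd write straight to a file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["ContactId", "Email"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# Upload step (sketch, not run here) using the azure-storage-blob package:
#   from azure.storage.blob import BlobServiceClient
#   client = BlobServiceClient.from_connection_string(conn_str)
#   blob = client.get_blob_client(container="ingest", blob="contacts.csv")
#   blob.upload_blob(csv_text, overwrite=True)

print(csv_text)
```

For Parquet you'd swap the `csv` step for something like pandas' `to_parquet` (which needs `pyarrow` installed); the upload step is the same either way.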
You can use OneDrive or Box or something different, but Azure Blob Storage or Azure Data Lake Storage (ADLS) are a lot more robust.
I think your table size is well below 10 GB, but run those queries and we can figure it out.