Microsoft Dynamics 365 | Integration, Dataverse, and general topics

Warning: Apache Spark 3.5 upgrade causes sync issues

Posted by BK-27011901-0
Following the upgrade to Apache Spark 3.5 in Dataverse, we noticed that some tables stopped syncing to our external data warehouse.
 
Spark jobs in Synapse Analytics are reporting the following errors:

Caused by: org.apache.spark.SparkUpgradeException: [INCONSISTENT_BEHAVIOR_CROSS_VERSION.WRITE_ANCIENT_DATETIME] You may get a different result due to the upgrading to Spark >= 3.0:
writing dates before 1582-10-15 or timestamps before 1900-01-01T00:00:00Z
into Parquet files can be dangerous, as the files may be read by Spark 2.x
or legacy versions of Hive later, which uses a legacy hybrid calendar that
is different from Spark 3.0+'s Proleptic Gregorian calendar. See more
details in SPARK-31404. You can set "spark.sql.parquet.datetimeRebaseModeInWrite" to "LEGACY" to rebase the
datetime values w.r.t. the calendar difference during writing, to get maximum
interoperability. Or set the config to "CORRECTED" to write the datetime
values as it is, if you are sure that the written files will only be read by
Spark 3.0+ or other systems that use Proleptic Gregorian calendar.
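For jobs where you control the Spark session (for example, your own Synapse Spark pool reading or rewriting the exported Parquet — the managed Synapse Link export itself may not expose these settings), the two options in the error message correspond to a Spark SQL config. A minimal sketch, choosing one of the two modes:

```properties
# LEGACY: rebase pre-1582-10-15 dates and pre-1900 timestamps to the hybrid
# Julian/Gregorian calendar on write, for compatibility with Spark 2.x / legacy Hive
spark.sql.parquet.datetimeRebaseModeInWrite=LEGACY

# CORRECTED: write the values as-is (proleptic Gregorian); safe only if every
# reader is Spark 3.0+ or another system using the proleptic Gregorian calendar
# spark.sql.parquet.datetimeRebaseModeInWrite=CORRECTED
```

There is a matching `spark.sql.parquet.datetimeRebaseModeInRead` for the read side; which mode is appropriate depends entirely on which engines will consume the files downstream.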
  • PL-26031223-0
    After upgrading, we are seeing errors as well: about 30-40 tables are not syncing (some sync, others do not). Did you do anything on your side to fix this?
  • Suggested answer
    11manish
    This issue is related to a behavior change introduced in Apache Spark 3.x, which enforces stricter handling of historical date/time values.
  • BK-27011901-0
    Initially we ended up spinning up another environment and relinking all 255 of our tables on Spark 3.4, because we just couldn't wait for a fix.

    Microsoft has since fixed Spark 3.5 (about two weeks after we reported the issue), but for tables that already stopped syncing, I think the only recourse is to unlink and relink them.

    However, Spark 3.4 has introduced another issue: all nvarchar columns are now created as varchar. We have opened a case with Microsoft to restore the Spark 3.3 behavior, where nvarchar maps to nvarchar. The case has been open for six weeks so far.
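The calendar change behind the suggested answer can be illustrated without Spark. Python's `datetime` module, like Spark 3.x, uses the proleptic Gregorian calendar; the legacy hybrid calendar used by Spark 2.x and old Hive follows Julian leap-year rules before the 1582 reform. The two calendars agree around the 3rd century, and every later century year that is a leap year under the Julian rule but not under the Gregorian rule adds one day of divergence — reaching the ten days the 1582 reform skipped. A self-contained sketch (not Spark's actual rebase code):

```python
def julian_leap(year: int) -> bool:
    # Julian rule: every 4th year is a leap year, with no century exception.
    return year % 4 == 0

def gregorian_leap(year: int) -> bool:
    # Gregorian rule: century years must also be divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# The two calendars agree between 1 March 200 and 28 February 300.
# Each later century year that is Julian-leap but not Gregorian-leap
# (300, 500, 600, 700, 900, 1000, 1100, 1300, 1400, 1500) adds one
# day of drift between them.
drift = sum(1 for y in range(300, 1582, 100)
            if julian_leap(y) and not gregorian_leap(y))
print(drift)  # -> 10: why 1582-10-04 (Julian) was followed by 1582-10-15
```

This ten-day offset (and the differing leap days further back) is exactly why a pre-1582 date written under one calendar and read under the other silently shifts, which is what the `WRITE_ANCIENT_DATETIME` guard protects against.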

