Using a cloud service to constantly back up a live database is a bit like giving two users full access and control of the same database file at the same time. As joint usage continues, as the size of the database increases, and as the volume of changes grows, the chances of both 'users' attempting to perform some action on the database at precisely the same moment become greater. What is going to happen if a RootsMagic user wants to write new data, or change existing data, in their database at the same time as the cloud service is synchronising (i.e. overwriting) the copy it retains in the cloud? Similarly, what would you expect to happen if you were in the middle of editing an existing document on a networked computer and someone else on another linked computer attempted to access and save a duplicate copy of that same document at the same time?
True network-aware database software would lock any individual records in use, to prevent their data being copied during a backup/synchronisation process or accessed by another user at the same time; it would retain any edits in a temporary file to ensure they were only committed to the main database after that backup/synchronisation process had completed, which helps to prevent data conflict/corruption. RootsMagic, in common with many other genealogy programs, uses an SQLite database, which is not designed or intended to work in that way. That's a downside of SQLite that is far outweighed by the upside of it being entirely free for developers to use in their desktop software, which keeps (or should keep) costs down for both the developer and the end user.
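The whole-file locking behaviour described above is easy to see for yourself with Python's built-in sqlite3 module. This is only a minimal sketch: the `person` table is an illustrative stand-in, not RootsMagic's actual schema, and the temporary file stands in for a live database.

```python
import os
import sqlite3
import tempfile

# Throwaway database file standing in for a live genealogy database.
path = os.path.join(tempfile.mkdtemp(), "demo.sqlite")

# isolation_level=None gives us manual transaction control.
writer = sqlite3.connect(path, isolation_level=None)
writer.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")
writer.execute("BEGIN IMMEDIATE")  # take the database-wide write lock
writer.execute("INSERT INTO person (name) VALUES ('Ada')")

# A second connection (think: a cloud client touching the same file)
# cannot start its own write while that lock is held -- SQLite locks
# the WHOLE file, not individual records.
other = sqlite3.connect(path, isolation_level=None, timeout=0.1)
blocked = None
try:
    other.execute("BEGIN IMMEDIATE")
except sqlite3.OperationalError as exc:
    blocked = str(exc)  # typically "database is locked"
print("second writer blocked:", blocked)

writer.execute("COMMIT")           # release the lock...
other.execute("BEGIN IMMEDIATE")   # ...and now the second writer succeeds
other.execute("ROLLBACK")
```

The point is not that SQLite is careless, quite the opposite: it refuses the second writer outright rather than risk corruption, but it has no concept of holding one record while releasing the rest of the file.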
Cloud services will, when correctly designed, often lock individual files during the backup/transmission process. That, of course, can easily lead the local user to believe their system/application has 'locked up' without warning, especially if a considerable amount of data needs to be synchronised or the internet connection is slow. This is why the recommended action is to suspend cloud backup services while using files that are being constantly written to.
If I remember correctly, Gramps does not currently use a traditional database model, but stores its data in an XML file, i.e. a special form of text file, which would suffer in the same way as any other text file if you tried to edit it and copy it at the same time.
When you mention 'GEDCOM compatible entry' in relation to genealogy software in general, the main issue is the wide variety of interpretations of the GEDCOM standard applied by software producers, i.e. it is not, strictly speaking, the GEDCOM standard itself that is the problem, but misunderstanding or misapplication of the standard by programmers. It's a bit like the English language: there is only one English language, but there are many 'locally created' dialects, which limits understanding and confuses users. With GEDCOM this often means that either:
- some data may not be exported fully or correctly to a GEDCOM file by some software applications; plus
- some data which is exported to a GEDCOM file by one product may not be correctly imported by another product because it doesn't fully understand the dialect used.
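Both failure modes can be sketched in a few lines. The fragments and the mini-importer below are hypothetical illustrations, not real products: `_MILT` is a made-up custom tag (underscore-prefixed tags are how GEDCOM dialects conventionally extend the standard), and a real importer would recognise far more than five tags.

```python
# Hypothetical illustration: two programs export the same military-service
# fact differently. "App A" uses the standard EVEN/TYPE structure; "App B"
# uses an invented custom tag, _MILT. A naive importer that only recognises
# a minimal set of published tags silently drops App B's event.

KNOWN_TAGS = {"INDI", "NAME", "EVEN", "TYPE", "DATE"}  # deliberately minimal

app_a = """0 @I1@ INDI
1 NAME John /Smith/
1 EVEN
2 TYPE Military Service
2 DATE 1916"""

app_b = """0 @I1@ INDI
1 NAME John /Smith/
1 _MILT
2 DATE 1916"""

def tag_of(line: str) -> str:
    # GEDCOM line grammar (simplified): level [xref] tag [value]
    parts = line.split(" ", 2)
    return parts[2].split(" ", 1)[0] if parts[1].startswith("@") else parts[1]

def import_record(gedcom: str):
    kept, dropped = [], []
    skip_level = None  # level of an unknown tag whose subtree we are skipping
    for line in gedcom.splitlines():
        level = int(line.split(" ", 1)[0])
        if skip_level is not None and level > skip_level:
            dropped.append(line)  # subordinate line of an unknown tag
            continue
        skip_level = None
        if tag_of(line) in KNOWN_TAGS:
            kept.append(line)
        else:
            dropped.append(line)
            skip_level = level
    return kept, dropped

a_kept, a_dropped = import_record(app_a)
b_kept, b_dropped = import_record(app_b)
print("App A lines lost on import:", a_dropped)  # []
print("App B lines lost on import:", b_dropped)  # ['1 _MILT', '2 DATE 1916']
```

Note that the receiving program gives no error in either case; the date of the event simply vanishes along with the tag it was subordinate to, which is exactly why users rarely notice the loss until much later.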
As far as 'an official list' of GEDCOM incompatibilities goes, that's a bit like a "How long is a piece of string?" question, given the volume of genealogy products currently available, plus those products/versions that are no longer supported. One attempt was made here: http://genealogytool...apps-crosswalk/ to inform FTM users when Ancestry announced that they were dropping support for desktop software, and before the FTM software was sold on to Software MacKiev.
The cynic in me also says that, while most genealogy products claim 'GEDCOM compatibility':
- their developers are rarely clear about which version of the GEDCOM Standard they mean and tend to adopt a 'mix and match' approach across multiple versions of the standard to suit their own interpretation, despite how long it has been since the Standard was last amended;
- few software developers provide any specific detail on where their software does/doesn't conform with the standard and what data is likely to be 'lost' when exporting data to GEDCOM;
- most software users have a naive belief in developers' claims that their chosen software does everything properly, and that it's always the fault of some other software when data doesn't transfer fully/correctly; and
- there is limited (minimal) business benefit or commercial advantage in software developers enabling their users to easily move their data to other software products with 100% reliability.