I'm not sure OC is asking for NoSQL - just a better experience with SQL migrations.
The relational model always requires a judgement call about how far to normalize. If I have a deeply nested JSON object of data, you could argue for defining all the relations and normalizing. But then to query the data you now have a 6-way join, and the query planner starts to struggle a bit. And if you later decide to change that structure, you have a very tricky migration script to run.
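To make the trade-off concrete, here is a toy sketch using Python's stdlib sqlite3 (two tables rather than six, and the table/column names are hypothetical): the same nested data stored normalized and queried with a join, versus stored as one JSON blob and read back whole.

```python
import json
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Normalized: each level of nesting becomes its own table, queried via joins.
cur.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
""")
cur.execute("INSERT INTO users  VALUES (1, 'Ada')")
cur.execute("INSERT INTO orders VALUES (10, 1, 9.5)")
row = cur.execute("""
    SELECT u.name, o.total
    FROM users u JOIN orders o ON o.user_id = u.id
""").fetchone()  # ('Ada', 9.5)

# Document-style: the same data kept as one JSON blob; no joins,
# but the database engine can no longer help plan queries over it.
cur.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
doc = {"name": "Ada", "orders": [{"id": 10, "total": 9.5}]}
cur.execute("INSERT INTO docs VALUES (1, ?)", (json.dumps(doc),))
body = json.loads(cur.execute("SELECT body FROM docs").fetchone()[0])
```

With six levels of nesting instead of one, the join version grows a table and a join clause per level, while the document version stays a single read.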
Whereas if you just keep it as a JSON object, you avoid a lot of unnecessary pain: you can write some quick re-mapping code in the application layer to handle the older version of the data, and start storing data in the new schema immediately. Later, you can scan over the collection, rewrite the remaining old-schema documents in place, and remove the re-mapping layer.
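That lazy-migration pattern can be sketched in a few lines (a minimal example with a hypothetical schema: v1 stored a flat address string, v2 splits it into a nested object):

```python
def remap(doc: dict) -> dict:
    """Upgrade an old-schema document to the current schema on read."""
    if doc.get("version", 1) >= 2:
        return doc  # already in the new schema, pass through untouched
    # v1 -> v2: split the flat address string into a nested object
    street, _, city = doc["address"].partition(", ")
    return {
        "version": 2,
        "name": doc["name"],
        "address": {"street": street, "city": city},
    }

# All reads go through remap(); new writes use the v2 shape immediately.
old = {"name": "Ada", "address": "12 Main St, Springfield"}
print(remap(old)["address"]["city"])  # Springfield
```

Once a background scan has rewritten every old document through the same function, `remap` becomes a no-op and can be deleted.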
So yes, your data has been relational the whole time, but the query complexity and migration complexity increased dramatically. It became very difficult to change the model.
The relational model is essentially a constraint on how you must structure your data so that relational algebra can be used to optimize query plans. When you relax these constraints, you find things like Datalog that provide purer ways to represent your data and relationships, in the sense that they can map closer to the real world, but this comes at the cost of automated query optimization.
So relational data models are often less representative of the real world (because of adherence to the relational model), frequently denormalized to suit the underlying database engine, and difficult to migrate. Together these create complexity and slow down shipping some features (although they can also greatly speed up others, like analytics).
That being said, SQL dbs are probably the best dbs we have today for solving business problems, and I think the migration and modeling problems could be solved with better tooling.