You must consider many questions before designing your architecture. For example: how quickly does the data need to reach the other system (maybe it’s needed ASAP, maybe it’s sufficient to transfer data in bulk on a regular basis, such as every hour)? What’s the expected number of requests? What should happen in case of a failure, such as when one of the systems isn’t available? And so on…
If you can process data in bulk, you may want to use the Data Management APIs to export and import files.
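For illustration, here is a minimal sketch of what a call to the Data management package REST API could look like. The environment URL, data project name (`CustTaxTransactions`) and legal entity (`USMF`) are hypothetical placeholders; in a real implementation you’d also need to authenticate with Azure AD and poll for the export result.

```python
# Sketch: building a request for the F&O Data management package API's
# ExportToPackage action. All names below (environment URL, data project,
# legal entity) are made-up examples, not values from a real system.

def build_export_request(base_url: str, definition_group: str, legal_entity: str) -> dict:
    """Return the URL and JSON body for an ExportToPackage call."""
    return {
        "url": (
            f"{base_url}/data/DataManagementDefinitionGroups"
            "/Microsoft.Dynamics.DataEntities.ExportToPackage"
        ),
        "body": {
            "definitionGroupId": definition_group,   # name of the data project in F&O
            "packageName": f"{definition_group}.zip",
            "executionId": "",                       # empty lets F&O generate one
            "reExecute": False,
            "legalEntityId": legal_entity,
        },
    }

if __name__ == "__main__":
    req = build_export_request(
        "https://yourenv.operations.dynamics.com",  # hypothetical environment URL
        "CustTaxTransactions",                      # hypothetical data project
        "USMF",
    )
    print(req["url"])
```

A scheduled job (e.g. every hour, as mentioned above) would send this request, wait for the package to be generated, and download the resulting file.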
If you want to process individual transactions ASAP, the notification from F&O (when a tax transaction is created) could be implemented as a business event. For example, it could be put into a message queue, from which an application could read it and update the other system. For updating F&O, OData services may be the best approach. Consider decoupling the two systems with a message queue such as Azure Service Bus; a Logic App can easily read messages from the queue and call OData services through the F&O connector.
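To make the decoupled flow concrete, here is a sketch of the piece that consumes a queue message and turns it into an OData call against F&O. The entity name (`TaxTransactions`), key and field names are hypothetical, purely for illustration; a real worker would also acquire an Azure AD token and actually send the request (or, as suggested above, you’d let a Logic App do this part via the F&O connector).

```python
# Sketch: mapping a business-event message (read from a queue) to an
# OData PATCH request against F&O. Entity and field names are assumed
# placeholders, not a real F&O data entity schema.
import json

def build_odata_patch(base_url: str, event: dict) -> dict:
    """Map a tax-transaction business event to an OData PATCH request."""
    key = event["TransactionId"]  # assumed key field in the event payload
    return {
        "method": "PATCH",
        "url": f"{base_url}/data/TaxTransactions('{key}')",   # hypothetical entity
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"SyncStatus": "Processed"}),      # hypothetical field
    }

if __name__ == "__main__":
    # Example message as it might arrive from the queue.
    msg = {"TransactionId": "TAX-0001", "Amount": 125.0}
    request = build_odata_patch("https://yourenv.operations.dynamics.com", msg)
    print(request["method"], request["url"])
```

The point of the queue in the middle is that if F&O or the other system is temporarily unavailable (one of the failure cases mentioned above), the message simply stays in the queue until it can be processed.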
Regarding your pictures, the first one seems to suggest that you would add another database to the F&O environment. That’s not possible. But it shouldn’t be a problem: none of the approaches I mentioned above needs any extra database. If you want, you could still put a database somewhere, acting as a message queue, but that would be more expensive, because you’d either have to pay for Azure SQL or host your own VM and license your own database server.
In either case, you can’t use DB triggers in F&O database.