Azimut Marketplace runs a modular financial services platform where partners plug in at different depths, from full API integrations to forms backed by back-office workflows. They're working with some pretty cool stuff, and not just the usual Jamstack-y headless CMS use cases.
We sat down to catch up with Lorenzo, their CTO, who's been using DatoCMS to do a lot more than just generate frontends.
Lorenzo’s first order of business as CTO was getting costs and operational risk down. They trialed self-hosting, then chose DatoCMS for reliability and because they did not want to own CMS infrastructure at all. The migration off their old vendor was scripted and fast, then the team rewrote their APIs against GraphQL. The punchline: uptime has been solid, and the monthly bill dropped dramatically.
It was a great deal because I cut the cost by more than 95%, to be honest.
They did not stop at pages. Editors now ship landing pages through a “builder” made of roughly fifty blocks. Idea to production can be hours. The same structure was cloned for their Spain rollout. That speed has changed how they work and how often content ships.
BUT.
They also didn't stop at websites. Behind the scenes, they've been working with a pretty nifty Java client for DatoCMS that lets them run CRM operations from the CMS itself.
Into the backend
OK, before getting into the nitty-gritty, the TL;DR: the team at Azimut Marketplace uses the CMS to author the emails their users receive, fetched through a Java client and delivered by their own backend rather than a tool like Mailchimp. The content can be localized per target audience, and the same logic carries over to their other communication notifications.
The move was simple in concept: treat backend messages like content. Put the mutable parts in Dato, keep the envelope and delivery in code, and fetch the right localized body at send time. Lorenzo explains the motivation without ceremony.
The interesting work, though, is on the server side: backend systems also talk to users and need localized strings that change often. The team moved email bodies and other server-side copy into Dato so marketing can edit without a deploy. The HTML body lives in Dato. A standard template lives in code. The service stitches them together and hands the result to a cloud mail sender. The CRM keeps being a CRM. Dato holds the words.
And of course, back-end sends email. And we don't want to do a deploy every time the marketing department or the content department says, OK, I have a new idea. I want to change this email.
Concretely, delivery is "boring" by design. They use the mail service that exists in their cloud, and the Dato integration is strictly for content. That separation keeps the architecture simple, cheap, and testable. Not to mention extremely flexible for when new ideas pop up, something we'll get to later.
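To make that shape concrete, here's a minimal sketch of the send path, with invented names throughout: the `DatoContent` and `MailSender` interfaces, the `emailTemplate` model, and its fields are ours, not Azimut Marketplace's, and the real Java SDK's method names will differ. The only thing it's meant to show is the split: editable body from the CMS, envelope and delivery in code.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

/** Sketch of the send path: body from the CMS, envelope and delivery in code. */
public class TransactionalMailService {

    /** Hypothetical stand-in for the thin Java client: GraphQL string in, raw JSON out. */
    public interface DatoContent {
        String query(String graphql) throws Exception;
    }

    /** Hypothetical stand-in for the cloud provider's mail API. */
    public interface MailSender {
        void send(String to, String subject, String htmlBody);
    }

    private static final ObjectMapper JSON = new ObjectMapper();

    private final DatoContent dato;
    private final MailSender mailer;

    public TransactionalMailService(DatoContent dato, MailSender mailer) {
        this.dato = dato;
        this.mailer = mailer;
    }

    public void sendWelcome(String to, String locale) throws Exception {
        // 1. Fetch the editable parts from Dato at send time.
        //    "emailTemplate" and its fields are invented model names.
        JsonNode tpl = JSON.readTree(dato.query("""
                {
                  emailTemplate(locale: %s, filter: { key: { eq: "welcome" } }) {
                    subject
                    htmlBody
                  }
                }
                """.formatted(locale)))
                .path("data").path("emailTemplate");

        // 2. The envelope lives in code: a fixed layout wraps whatever editors wrote.
        String html = """
                <html><body>
                  %s
                </body></html>
                """.formatted(tpl.path("htmlBody").asText());

        // 3. Delivery stays "boring": hand the result to the cloud's mail service.
        mailer.send(to, tpl.path("subject").asText(), html);
    }
}
```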
Why a Java client?
Azimut Marketplace runs Java with Quarkus. There was no JVM client for Dato, so they wrote one and open sourced it to Maven Central. It is intentionally small. The SDK does not invent new concepts. It takes a GraphQL query string, forwards it, gives you a response, and gets out of the way. Types are your job, which is exactly how most Java teams want it for backend services.
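For a sense of how little there is to it: stripped of error handling and configuration, "take a query string, forward it, hand back the response" boils down to an authenticated POST against DatoCMS's GraphQL endpoint. The sketch below is not the SDK's source, just that idea expressed with the plain JDK HTTP client.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public final class TinyDatoClient {

    private static final URI ENDPOINT = URI.create("https://graphql.datocms.com/");

    private final HttpClient http = HttpClient.newHttpClient();
    private final String apiToken;

    public TinyDatoClient(String apiToken) {
        this.apiToken = apiToken;
    }

    /** Forwards a GraphQL query string as-is and returns the raw JSON response. */
    public String query(String graphql) throws Exception {
        String body = "{\"query\": " + toJsonString(graphql) + "}";
        HttpRequest request = HttpRequest.newBuilder(ENDPOINT)
                .header("Authorization", "Bearer " + apiToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        return http.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }

    // Minimal JSON escaping so the query can sit inside the request body.
    private static String toJsonString(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n") + "\"";
    }
}
```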
You install it from Maven central. So it is a Maven install or you can go to Maven central, hit DatoCMS Java client or DatoCMS Java SDK. Take the pom.xml code or even Gradle code. You write it in your code base and then you install the dependency. Done.
From there you instantiate the SDK client, inject the API key, and call one of the “get data from DatoCMS” methods with a project and an arbitrary GraphQL query. The only discipline you need is to define your response types and regenerate them when schemas change. Localization is expressed in the query, not hidden behind magic.
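In practice that looks roughly like the sketch below. The `GraphqlRunner` interface stands in for the SDK client, and the `notification` model and its fields are invented; the real method and model names will differ. What it's meant to show is the discipline described above: the response type is a record you own, and the locale sits in the query rather than behind any magic.

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class NotificationCopy {

    /** Stand-in for the SDK client: hand over a GraphQL string, get the raw JSON response back. */
    public interface GraphqlRunner {
        String run(String graphql) throws Exception;
    }

    /** You own the response types: a plain record matching the slice of schema you query.
        Adjust or regenerate it when the model in Dato changes. */
    public record Notification(String title, String body) {}

    private static final ObjectMapper JSON = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    public static Notification fetch(GraphqlRunner dato, String key, String locale) throws Exception {
        // Localization is expressed in the query itself: the locale is an argument like any other.
        // The "notification" model and its fields are invented for the example.
        String json = dato.run("""
                {
                  notification(locale: %s, filter: { key: { eq: "%s" } }) {
                    title
                    body
                  }
                }
                """.formatted(locale, key));

        return JSON.treeToValue(JSON.readTree(json).path("data").path("notification"),
                Notification.class);
    }
}
```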
That rawness is on purpose.
How it's being used
Push notifications are the canonical case for server-side localization. Devices render what they are given; there is no translation stage on the client. That means you fetch the right language from Dato before you call Apple or Google. Technically, should they expand their communication channels, the same story would hold for SMS, and for voice prompts in multi-factor flows where a provider does text-to-speech. All of these become safer and faster if they read from the same content source as the website.
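Here's a hedged sketch of that fetch-before-send step, with both interfaces invented for the example (`MessageSource` for the Dato lookup, `PushProvider` for whatever wraps APNs or FCM): the localized copy is resolved server-side first, because the device will render exactly what it receives.

```java
public class PushNotifier {

    /** Localized copy resolved from DatoCMS; the lookup itself is the query pattern shown earlier. */
    public interface MessageSource {
        Message localized(String key, String locale) throws Exception;
        record Message(String title, String body) {}
    }

    /** Stand-in for whatever wraps APNs or FCM; the real provider API is not shown here. */
    public interface PushProvider {
        void push(String deviceToken, String title, String body);
    }

    private final MessageSource content;
    private final PushProvider provider;

    public PushNotifier(MessageSource content, PushProvider provider) {
        this.content = content;
        this.provider = provider;
    }

    public void send(String deviceToken, String userLocale, String messageKey) throws Exception {
        // Devices render exactly what they are given, so the right language
        // has to be resolved server-side before Apple or Google is called.
        MessageSource.Message msg = content.localized(messageKey, userLocale);
        provider.push(deviceToken, msg.title(), msg.body());
    }
}
```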
Honestly, it's a pretty zippy and nifty service as it is, but with the room to grow that such a nice, simple package gives them, the potential use cases are kinda dope.
DX in practice
The DX is about as minimal as you can make it.
Add a dependency, create a client, pass a query, map the response.
If the schema changes, regenerate your types where it matters.
If you need locale variants, put them in the query.
If you need different backends to share a content shape, share the fragment.
The result is a backend that can speak to users across channels without a parade of tools, and an editing flow that never pings engineering for copy changes.
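To make the last two points concrete, here's a sketch, under invented model and fragment names, of what "locale in the query, shape in a shared fragment" can look like when several backends need the same content shape.

```java
/** One place for the content shape; any backend that needs it composes it into its own query. */
public final class SharedContentFragments {

    // Hypothetical model and fields: a reusable "message copy" shape.
    public static final String MESSAGE_FIELDS = """
            fragment MessageFields on MessageRecord {
              key
              title
              body
            }
            """;

    /** Different services build different queries around the same fragment. */
    public static String localizedMessageQuery(String key, String locale) {
        return """
                query {
                  message(locale: %s, filter: { key: { eq: "%s" } }) {
                    ...MessageFields
                  }
                }
                %s
                """.formatted(locale, key, MESSAGE_FIELDS);
    }

    private SharedContentFragments() {}
}
```

Both the email service and the push service would then hand the resulting query string to the same thin client, so a renamed field only has to change in one place.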
Azimut Marketplace built and released the client because it solved their problem and because they wanted it to be useful beyond their walls. They plan to evolve it as their own needs evolve, but only in ways that stay generic enough for the community. That constraint keeps the SDK lean and stops it from turning into a product of its own.