Analytics Engineer Freelance
Location: Paris (on-site)
Contract: Freelance
Duration: 3 months · 4 days/week
Start: ASAP
Languages: French and English (fluent in both)
About Colonies
Colonies is on a mission to solve housing. We build and manage flexible living spaces across Europe. The tech team is small and high-leverage. We run a hub-and-spoke model: a Core Tech team maintains the platform, while embedded engineers sit inside business teams and solve problems close to the source. What works gets hardened and scaled. What doesn't gets killed quickly.
The job
You will embed directly in our Assets team to automate investor reporting that is currently done by hand. The first mission is CRG reporting: understand the manual process, identify every data source involved, get them clean and modelled in our dbt/BigQuery stack, and build the reporting apps on top — most likely in Hex — customised per investor where needed.
Concretely, expect to:
Map the current manual CRG reporting workflow with the Assets team and identify gaps in available data
Add, clean, and model new data sources into our dbt transformation layer (Stripe, Qonto, Pipedrive, MS Postgres, Zendesk, among others)
Build shared dbt marts that serve as the backbone for all downstream analytics
Develop CRG reporting apps and investor-facing dashboards in Hex, tailored per investor where required
Reconcile data across sources — financial transactions, occupancy, operational costs — and build the tests to keep it trustworthy
Set up alerting and data quality monitoring (dbt tests, Elementary → Slack)
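To give a flavour of the reconciliation work above, here is a minimal Python sketch of matching transactions across two systems by a shared reference and flagging discrepancies. The `Txn` shape, its field names, and the match-by-reference rule are illustrative assumptions for this posting, not Colonies' actual schema or process.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Txn:
    ref: str     # shared payment reference (hypothetical field)
    amount: int  # amount in cents, to avoid float rounding

def reconcile(source_a: list[Txn], source_b: list[Txn]) -> dict:
    """Match transactions across two sources by reference and
    report matches, amount mismatches, and one-sided records."""
    a_by_ref = {t.ref: t for t in source_a}
    b_by_ref = {t.ref: t for t in source_b}
    shared = a_by_ref.keys() & b_by_ref.keys()
    return {
        "matched": sorted(
            r for r in shared if a_by_ref[r].amount == b_by_ref[r].amount
        ),
        "amount_mismatch": sorted(
            r for r in shared if a_by_ref[r].amount != b_by_ref[r].amount
        ),
        "missing_in_b": sorted(a_by_ref.keys() - b_by_ref.keys()),
        "missing_in_a": sorted(b_by_ref.keys() - a_by_ref.keys()),
    }
```

In production this logic would live as dbt tests and models over the warehouse tables rather than an ad-hoc script, but the shape of the problem is the same: define the match key, compare the amounts, and surface every record that falls out of either side.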
Once CRG reporting is live, the role extends to other high-impact data projects: budget dashboards, a KPI data room, and whatever the business needs next.
You will report to the Core Tech team for standards and tooling, but your day-to-day will be spent with the Assets team. The feedback loop is short: build, ship, watch someone use it, iterate.
What this is not
This is not a BI analyst role where you receive specs and build charts. We need someone who can walk into a team, understand their reporting pain end-to-end, wrangle the data engineering, and ship the finished product. You own the problem from source data to investor-ready output.
You
You are strong in SQL and dbt. You have built and maintained transformation layers in a production analytics stack — ideally on BigQuery or a similar warehouse.
You have experience building data apps or interactive reports. Hex experience is a plus; similar tools (Streamlit, Retool dashboards, Metabase, Looker) count too.
You can write Python for data wrangling, reconciliation scripts, and light automation.
You have done data reconciliation work — matching transactions across systems, finding discrepancies, building processes to keep data aligned.
You understand the analytics engineering workflow: sources → staging → marts → consumption layer, with testing and documentation baked in.
You use AI tools (Copilot, Claude, Cursor, etc.) as part of your daily workflow, not as a novelty.
You can sit with non-technical stakeholders, understand their reporting needs, and translate that into a data model and a finished app — not a slide deck.
You are comfortable with ambiguity. The current process is manual and messy. You will need to reverse-engineer it before you can automate it.
You have a bias for action and a proven track record. You can point to data products you have built that people actually use.