About the Company
Cortex is an AI-powered platform built for real estate investors and asset managers. It transforms fragmented financial, technical, and commercial data into clear dashboards and predictive insights, helping real estate professionals optimize performance, reduce costs, and make smarter, faster decisions.
Designed for strategic decision-making in real estate, Cortex uses advanced AI to perform complex analysis and generate actionable insights.
Responsibilities
| Responsibility Area | Key Actions |
|---|---|
| LLM/Generative AI Development | Design, implement, and optimize LLM-powered features leveraging GPT agents and advanced RAG architectures to deliver predictive insights and automated workflows. |
| Custom Model Development | Develop, train, and fine-tune novel machine learning and deep learning models from scratch using frameworks like PyTorch or TensorFlow, specifically targeting real estate performance optimization and risk assessment. |
| Data Engineering & Preparation | Own the end-to-end data pipeline, including sourcing, cleaning, transformation, and optimization of large, complex, structured, and unstructured real estate datasets for model consumption. |
| Research & Innovation | Conduct thorough research on state-of-the-art AI/ML methodologies (e.g., time-series forecasting, graph neural networks) and evaluate their applicability to real estate investment challenges. |
| Model Deployment & MLOps | Collaborate with MLOps and engineering teams to integrate trained models into production environments, ensuring scalability, low latency, and continuous performance monitoring. |
| Performance & Iteration | Define key performance indicators (KPIs) for AI features, monitor model performance post-deployment, and implement iterative improvements based on A/B testing and data drift analysis. |
Requirements
- Strong Python programming expertise with a minimum of 3 years of hands-on experience developing production-grade AI/ML solutions.
- Deep, demonstrable experience with the practical application of GPT agents and LLM frameworks (e.g., LangChain, custom OpenAI API integrations) for complex task automation.
- Essential knowledge of advanced Retrieval-Augmented Generation (RAG) techniques, including vector indexing, chunking strategies, and proficiency with vector databases (e.g., Pinecone, Chroma).
- Expert proficiency in handling, querying (SQL), cleaning, and transforming large, multi-modal datasets, and in engineering features from them.
- Proven experience in model lifecycle management, including training, validation, hyperparameter tuning, and fine-tuning custom models using PyTorch, TensorFlow, or similar deep learning frameworks.
- Familiarity with MLOps principles and tools for model versioning, deployment, inference optimization, and monitoring (e.g., MLflow, Kubeflow).
- Demonstrated capacity for independent, hypothesis-driven problem-solving and translating ambiguous business questions into rigorous data science projects.
Nice to Have
- Direct domain knowledge or experience with financial modeling, real estate analytics, or time-series data analysis.
- Experience building scalable data pipelines using tools like Spark, Kafka, or Airflow.
- Practical cloud deployment experience with AI services on platforms such as AWS (SageMaker), GCP (Vertex AI), or Azure.
- Understanding of data governance, privacy, and security best practices specific to sensitive financial data.