Staff Full-Stack Engineer

Location: Remote in Brazil, Colombia, Mexico, or Chile
Must be able to work PST hours

Our team is building a next-generation Cloud Demand Forecasting Tool to replace the legacy spreadsheet-driven processes and fragmented portals currently used to plan infrastructure capacity across Bare Metal & 3PC services. This is a greenfield modernization effort: you will architect and build a scalable, maintainable platform that enables 500+ concurrent users across dozens of lines of business to submit, approve, aggregate, and actualize infrastructure demand forecasts on a rolling horizon. You will own the migration from legacy tooling and deliver a system that reduces manual coordination effort by 80%, accelerates planning cycles from weeks to days, and achieves greater than 90% forecast accuracy within a specified window.

Key Responsibilities
  • Design and build the end-to-end platform: web portal, RESTful APIs, CLI, and data pipelines for cloud infrastructure demand forecasting and lifecycle management.

  • Implement complex, configurable workflow engines supporting a multi-stage demand forecast lifecycle with SLA tracking, automated routing, and notification triggers.

  • Build a demand normalization and validation layer that standardizes hardware SKUs, translates bare metal demand into sellable/vendable units, and validates submissions against real-time constraints (data center power/space, budget, lead times).

  • Develop hierarchical forecast aggregation across business units, geographies, resource types, time horizons, and scenario types (baseline, stretch, low-case).

  • Create executive dashboards and reporting modules with drill-down capabilities, variance tracking (forecast vs. actual), exception queue management, and multi-format output (CSV, JSON, Excel, API).

  • Implement predictive analytics capabilities: time series analysis, seasonal decomposition, ML-based demand prediction, and confidence interval quantification.

  • Architect for scale and reliability: design for 500+ concurrent users, high availability, and sustained growth in forecast data volume.

  • Own security and access control: SSO integration, granular RBAC, encryption at rest and in transit, complete audit logging with observability integration.
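As a rough illustration of the workflow-engine responsibility above, the multi-stage forecast lifecycle with SLA tracking can be sketched as a small state machine. This is an illustrative Python sketch only: the stage names, transitions, and 48-hour SLA window are invented for the example, not the platform's actual configuration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical lifecycle configuration: stages, routing, and SLA windows
# would be configurable per line of business in the real platform.
LIFECYCLE = {
    "draft":      {"next": "submitted",  "sla_hours": None},
    "submitted":  {"next": "approved",   "sla_hours": 48},  # approver SLA (assumed)
    "approved":   {"next": "actualized", "sla_hours": None},
    "actualized": {"next": None,         "sla_hours": None},  # terminal stage
}

@dataclass
class Forecast:
    owner: str
    stage: str = "draft"
    stage_entered: datetime = field(default_factory=datetime.utcnow)

    def advance(self, now=None):
        """Move to the next lifecycle stage and reset the SLA clock."""
        nxt = LIFECYCLE[self.stage]["next"]
        if nxt is None:
            raise ValueError(f"{self.stage} is a terminal stage")
        self.stage = nxt
        self.stage_entered = now or datetime.utcnow()

    def sla_breached(self, now=None):
        """True if the current stage has exceeded its SLA window."""
        sla = LIFECYCLE[self.stage]["sla_hours"]
        if sla is None:
            return False
        return (now or datetime.utcnow()) - self.stage_entered > timedelta(hours=sla)
```

A production engine would add configurable routing, notification triggers, and persistence; the sketch only shows the stateful core.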

Required Qualifications

  • 7+ years of software engineering experience, with at least 3 years building enterprise-grade internal tools or planning/forecasting platforms.

  • Strong experience migrating from legacy systems (spreadsheets, fragmented portals) to modern, consolidated web applications.

  • Deep expertise in full-stack development, building both the backend services and frontend interfaces end-to-end.

  • Proven track record designing and implementing complex stateful workflow engines with multi-level approval chains, configurable routing, and SLA enforcement.

  • Experience building RESTful APIs with versioning, rate limiting, comprehensive error handling, and interactive documentation (OpenAPI/Swagger).

  • Strong data engineering skills: ETL pipelines, data normalization, validation frameworks, and integration with data lakes.

  • Experience with role-based access control systems and enterprise SSO (SAML/OIDC).

  • Solid understanding of relational database design, query optimization, and data archival strategies.
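Rate limiting, one of the API qualifications above, is commonly implemented as a token bucket. A minimal sketch (the rate and capacity values are illustrative; production systems typically rely on an API gateway or a Redis-backed limiter rather than in-process state):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (illustrative only)."""

    def __init__(self, rate, capacity, now=None):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic() if now is None else now

    def allow(self, now=None):
        """Consume one token if available; False means the caller is throttled."""
        now = time.monotonic() if now is None else now
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```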

Preferred Qualifications

  • Experience in infrastructure capacity planning, cloud resource management, or supply chain forecasting domains.

  • Familiarity with predictive analytics: time series forecasting, seasonal decomposition, and scikit-learn or equivalent ML frameworks.

  • Experience building CLI tools that operate against platform APIs.

  • Background working with financial planning workflows, budget approval chains, or fiscal tracking systems.

  • Experience with observability and audit logging pipelines (Splunk, Grafana, or equivalent).

  • Knowledge of deep learning architectures for time series forecasting, particularly LSTM (Long Short-Term Memory) networks, transformer-based models (e.g., Temporal Fusion Transformers), or Prophet for demand prediction and seasonal pattern recognition at scale.
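To make the seasonal-decomposition skill above concrete, here is a deliberately minimal additive decomposition in plain Python: a stand-in for library routines such as statsmodels' seasonal_decompose, with a made-up series and period.

```python
from statistics import mean

def seasonal_indices(series, period):
    """Additive seasonal component: per-phase mean minus the grand mean.

    A toy version of what full decomposition libraries do after removing
    trend; real demand data would first be detrended (e.g. by a centered
    moving average) before extracting seasonality.
    """
    grand = mean(series)
    return [mean(series[i::period]) - grand for i in range(period)]
```

For a repeating quarterly-style pattern like [10, 20, 30, 10, 20, 30] with period 3, the grand mean is 20 and the indices come out to [-10, 0, 10].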

Tech Stack

Frontend: React, TypeScript, modern component libraries; data visualization with D3.js/Recharts for interactive dashboards

Backend: Python (FastAPI/Django) with service-layer architecture for workflow orchestration, validation, and aggregation

APIs: RESTful services with OpenAPI/Swagger

CLI: Python (Click/Typer) or Go-based CLI for API interaction and workflow automation

Database & Storage: PostgreSQL for transactional/workflow data; integration with the data lake

Data Pipelines: Apache Airflow (or equivalent) for ETL, normalization, validation, and scheduled forecasting workflows

Analytics/ML: Python (pandas, scikit-learn, statsmodels) for time series forecasting, seasonal analysis, and demand modeling (MAPE-driven). Knowledge of advanced time series models such as LSTM (Long Short-Term Memory) networks is a plus.

Security: SSO (SAML 2.0/OIDC), RBAC, TLS (in transit), AES-256 (at rest), and DLP controls

Messaging: Kafka (or equivalent) for event-driven workflows; notifications via Email, Slack, Unibox

Caching: Redis for sessions and frequently accessed data

Observability: Logging and monitoring via Splunk/Grafana, APM, SLA dashboards

Infrastructure: Kubernetes-based microservices, CI/CD pipelines, automated backups

File I/O: CSV, Excel (openpyxl/Apache POI), JSON for bulk import/export
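The MAPE-driven evaluation noted in the Analytics/ML stack can be illustrated in a few lines. Pure-Python sketch; skipping zero actuals is one common convention for handling MAPE's division by zero, not a prescribed one.

```python
def mape(actuals, forecasts):
    """Mean Absolute Percentage Error, in percent.

    Pairs where the actual is zero are skipped to avoid division by zero
    (an assumption of this sketch; other conventions exist, e.g. sMAPE).
    """
    terms = [abs(a - f) / abs(a) for a, f in zip(actuals, forecasts) if a != 0]
    return 100 * sum(terms) / len(terms)
```

Under this metric, "greater than 90% forecast accuracy" corresponds roughly to keeping MAPE below 10%: forecasting 90 and 220 against actuals of 100 and 200 yields a MAPE of exactly 10.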