Job Title: T&T Engineering | Data Analyst | Consultant
Must Have:
1) Strong programming skills in Python (pandas, asyncio, requests, FastAPI/Flask)
2) Advanced PySpark experience for distributed data processing
3) Proven experience in developing data agents (e.g., LangChain, LLM-based agents, workflow automation frameworks, or custom-built agents)
4) Experience with cloud platforms (Azure, AWS, or GCP) and their data services
5) Proficiency in SQL and working with relational and NoSQL databases
6) Familiarity with orchestration tools (Airflow, Prefect, or similar)
7) Knowledge of data security, governance, and compliance frameworks
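As a rough illustration of the day-to-day Python work behind item 1, here is a minimal pandas sketch of a typical aggregation step; the dataset and column names are hypothetical, not part of the role description:

```python
import pandas as pd

# Hypothetical raw event data; column names are illustrative only.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 3],
    "amount": [10.0, 15.5, 7.25, 3.0, 8.0, 4.5],
})

# Aggregate spend per user -- a typical pandas transformation
# that would scale up to PySpark (item 2) on larger data.
summary = (
    events.groupby("user_id", as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "total_amount"})
)
print(summary)
```

The same groupby/sum pattern translates almost verbatim to a PySpark DataFrame for distributed processing.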
Good to Have:
1) Exposure to Generative AI / LLM integration in data workflows
2) Knowledge of messaging/streaming platforms (Kafka, Kinesis)
3) CI/CD and Infrastructure as Code experience
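For the SQL proficiency expected in item 5 of the must-have list, a self-contained sketch using Python's built-in sqlite3 module; the table and data are purely illustrative:

```python
import sqlite3

# In-memory database; schema and rows are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "east", 100.0), (2, "west", 50.0), (3, "east", 25.0)],
)

# Typical aggregate query: total order amount per region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 125.0), ('west', 50.0)]
conn.close()
```

In practice the same GROUP BY pattern applies unchanged on production relational databases (item 5), with sqlite3 swapped for the appropriate driver.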