Zark Lab Remote

About the Team:

Zark Lab is building foundation models for blockchain transactions and information. Our work focuses on search, retrieval, and generative modeling applied to on-chain and off-chain data. We are developing systems that enable efficient indexing, retrieval-augmented generation (RAG), and vector search for structured and unstructured blockchain datasets. Our models process billions of transactions and smart contracts across multiple blockchains, applying sequence modeling, graph-based learning, and language models to extract insights and improve data accessibility.

The team consists of former founders, senior engineers, and executives from Google, Meta, Goldman Sachs, and other leading technology and financial institutions. Our backgrounds span large-scale distributed systems, machine learning, engineering, and information retrieval, and we are focused on advancing the state of AI-driven search and computation for blockchains.

What you will do:

  • Build large-scale web scraping and ingestion pipelines for on-chain and off-chain blockchain data
  • Develop and optimize search architectures, integrating vector search, ANN retrieval, and ranking models
  • Fine-tune LLMs for query expansion, semantic search, and retrieval-augmented generation (RAG)
  • Reduce query latency through index optimization, ANN search, and distributed execution
  • Scale distributed indexing pipelines for efficient storage, deduplication, and retrieval
  • Optimize distributed storage and compute with Snowflake, ClickHouse, RocksDB, and vector databases
  • Build scalable systems to process high-throughput blockchain transactions and queries
  • Deploy and optimize cloud workloads on GCP with Kubernetes and containerized processing
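As a flavor of the retrieval work in the list above, here is a minimal, purely illustrative vector-search sketch in Python. The use of FAISS, the placeholder embed() function, and the toy corpus are assumptions made for the example only; the posting does not specify Zark Lab's actual stack.

```python
# Minimal vector-search sketch (illustrative only; not Zark Lab's actual stack).
# `embed()` is a stand-in for a real embedding model.
import numpy as np
import faiss


def embed(texts: list[str]) -> np.ndarray:
    """Placeholder embedding function; a real pipeline would call an embedding model."""
    rng = np.random.default_rng(0)
    return rng.random((len(texts), 384), dtype=np.float32)


# Index a small corpus of hypothetical transaction descriptions.
corpus = [
    "Swap 1.2 ETH for USDC on a DEX",
    "Mint from NFT collection XYZ",
    "Bridge transfer from Ethereum to Arbitrum",
]
vectors = embed(corpus)
index = faiss.IndexFlatL2(int(vectors.shape[1]))  # exact search; ANN indexes (IVF/HNSW) scale further
index.add(vectors)

# Retrieve the top-k nearest documents for a query, e.g. to feed a RAG prompt.
query = embed(["token swaps on decentralized exchanges"])
distances, ids = index.search(query, 2)
for rank, doc_id in enumerate(ids[0]):
    print(rank, corpus[doc_id], distances[0][rank])
```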

You might thrive in this role if you have:

  • A BS/MS/PhD in Computer Science or a related field.
  • 5+ years of experience in AI/ML, distributed search, or large-scale data processing.
  • Strong programming skills in Python, TypeScript, or Node.js.
  • Expertise in database design (SQL and NoSQL) and high-throughput data systems.
  • Experience with web crawling, data scraping, and large-scale ingestion pipelines.
  • Knowledge of vector search, retrieval-augmented generation (RAG), and embedding models (preferred).
  • Hands-on experience with GCP, Kubernetes, and Docker for cloud-scale deployment.
  • A passion for blockchain, AI search, and distributed systems.

Why join us?

  • Work on cutting-edge generative AI and search technologies applied to blockchain
  • Solve complex challenges in large-scale indexing, vector search, and AI-powered retrieval
  • Be part of a high-caliber team of former founders, engineers, and executives from leading tech and financial firms
  • Competitive salary and equity opportunities
  • Fully remote team with a fast-moving, high-impact culture
  • Own and shape the future of AI-driven search and retrieval for Web3

If you’re excited about search, generative AI, and real-time blockchain inference, we’d love to hear from you.

Apply now or reach out for more details.

Biconomy Remote

Biconomy empowers Web3 developers to build seamless, user-friendly dApps that work effortlessly across multiple blockchains. Our battle-tested modular account and execution stack eliminates traditional UX friction points, helping projects accelerate user adoption while reducing development costs. By processing over 50 million transactions across the 300+ dApps we’ve served, we’re powering the future of onchain economies.

The Role: Innovating at the Intersection of AI and DeFi

We are assembling a world-class team to redefine on-chain analytics using AI Agents & Machine Learning. As a DevOps Engineer, you will build and optimize our AI & ML infrastructure, enabling real-time wallet activity analysis, ML-driven tagging, and PnL insights at scale. You'll work with Kafka, Snowflake, AWS S3, and high-speed data pipelines to process 10-15 billion rows of historical data alongside real-time streaming events, ensuring a scalable, secure, and efficient AI ecosystem.

This is a high-impact role at the core of our AI-driven crypto intelligence platform.

What Will You Be Doing?

Scalable AI & ML Infrastructure

  • Design & optimize cloud-native architectures for AI & ML-driven analytics on DeFi transactions.
  • Develop and maintain high-performance, distributed computing environments that process billions of on-chain and off-chain events.
  • Deploy and manage ML models & AI agents efficiently in Kubernetes (K8s) and other containerized environments.

High-Speed Data Engineering

  • Design real-time streaming pipelines using Kafka for high-frequency on-chain transaction ingestion.
  • Optimize Snowflake queries & storage solutions to handle large-scale blockchain datasets efficiently.
  • Implement ETL/ELT pipelines for structured & unstructured blockchain data aggregation.
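As an illustration of the kind of pipeline described in the list above, here is a hedged Python sketch that micro-batches a Kafka topic into a Snowflake table. The library choices (kafka-python, snowflake-connector-python), the topic and table names, and the credentials are placeholders for the example, not details of Biconomy's actual stack.

```python
# Illustrative micro-batching sketch: Kafka topic -> Snowflake table.
# Topic, table, and credentials are placeholders.
import json

from kafka import KafkaConsumer
import snowflake.connector

consumer = KafkaConsumer(
    "onchain-transactions",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v),
    enable_auto_commit=False,                    # commit offsets only after data lands
)

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)

BATCH_SIZE = 1_000
batch = []
for message in consumer:
    batch.append((message.value["tx_hash"], json.dumps(message.value)))
    if len(batch) >= BATCH_SIZE:
        # executemany keeps round-trips low; much larger loads would typically
        # go through staged files and COPY INTO instead of row inserts.
        conn.cursor().executemany(
            "INSERT INTO raw_transactions (tx_hash, payload) VALUES (%s, %s)",
            batch,
        )
        consumer.commit()   # acknowledge the batch only once it is stored
        batch.clear()
```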

Infrastructure Automation & Reliability

  • Automate cloud infrastructure using Terraform, Pulumi, or CloudFormation for seamless scaling.
  • Enhance security & performance with CI/CD best practices for AI model deployment.
  • Implement observability, logging & monitoring with tools like Prometheus, Grafana, and Datadog.
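For a sense of what the observability bullet can look like in practice, below is a small, illustrative Python sketch using prometheus_client. The metric names and the process_event() placeholder are assumptions for the example, not an existing Biconomy API.

```python
# Sketch of pipeline instrumentation with prometheus_client (illustrative only).
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

EVENTS_PROCESSED = Counter(
    "pipeline_events_processed_total", "On-chain events processed", ["chain"]
)
EVENT_LATENCY = Histogram(
    "pipeline_event_latency_seconds", "Per-event processing latency"
)


def process_event(event: dict) -> None:
    """Placeholder for the real processing step."""
    time.sleep(random.uniform(0.001, 0.01))


if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://localhost:8000/metrics
    while True:
        event = {"chain": "ethereum"}
        with EVENT_LATENCY.time():       # records processing time per event
            process_event(event)
        EVENTS_PROCESSED.labels(chain=event["chain"]).inc()
```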

ML & AI Model Deployment

  • Streamline model deployment for AI agents that analyze blockchain wallet behavior.
  • Build infrastructure to support model training, inference, and real-time decision-making.
  • Optimize GPU workloads for AI-driven pattern recognition & risk analysis.
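To make the model-serving bullets above concrete, here is a minimal, illustrative FastAPI + PyTorch inference endpoint. The wallet-tagging model, feature layout, labels, and route are placeholders for the sketch and not Biconomy's actual service.

```python
# Minimal model-serving sketch (FastAPI + PyTorch); run with: uvicorn app:app
# The model and labels are stand-ins, not a real wallet-tagging service.
import torch
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Stand-in for a trained wallet-behavior classifier loaded at startup.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 16), torch.nn.ReLU(), torch.nn.Linear(16, 3)
)
model.eval()
LABELS = ["retail", "market_maker", "bridge"]  # hypothetical tags


class WalletFeatures(BaseModel):
    features: list[float]  # assumed: 8 engineered features per wallet


@app.post("/tag-wallet")
def tag_wallet(payload: WalletFeatures) -> dict:
    with torch.no_grad():  # inference only; no gradients needed
        logits = model(torch.tensor(payload.features).unsqueeze(0))
        label = LABELS[int(logits.argmax(dim=1))]
    return {"label": label}
```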

Collaboration Across Teams

  • Work alongside AI engineers, blockchain developers, and data scientists to integrate AI-driven insights into DeFi tools.
  • Optimize node infrastructure & RPC services for multi-chain DeFi interactions.
  • Research and implement best practices in Web3 DevOps, cloud automation, and ML infrastructure.

Requirements:

Core Experience

  • 5+ years of DevOps experience, specializing in AI/ML infrastructure & high-scale data pipelines.
  • Expertise in AWS, GCP, or Azure, focusing on scalable, event-driven architectures.
  • Experience handling Kafka queues, Snowflake databases, and massive-scale data processing.

DevOps & Infrastructure Skills

  • CI/CD Automation using GitHub Actions, CircleCI, Jenkins, or ArgoCD.
  • Kubernetes & Docker for managing AI workloads and high-performance model inference.
  • Infrastructure-as-Code (IaC): Terraform, Pulumi, or CloudFormation.

Data Engineering & AI Model Ops

  • Experience with Kafka, Snowflake, and S3 for real-time & historical data processing.
  • ML Model Deployment: Knowledge of TensorFlow, PyTorch, or ONNX for AI-based wallet analytics.
  • Strong Python & Bash scripting for automation and orchestration.

Security & Reliability

  • Security-first mindset: Experience with IAM, firewall management, and Web3 security best practices.
  • Observability & Monitoring using Prometheus, Grafana, Datadog, or OpenTelemetry.

Bonus Skills

  • Experience with multi-chain blockchain infrastructure (Ethereum, Solana, L2s).
  • Knowledge of DeFi protocols, smart contracts, and risk analytics.
  • Experience with AI inference at scale (Ray, Triton, Hugging Face Transformers, etc.).

What We Offer:

  • Flexible Working Hours: Enjoy autonomy over your schedule.
  • Generous Vacation Policy: 25 days of vacation per year, plus public holidays.
  • Competitive Salary: With regular performance reviews.
  • Token Allocation: Be rewarded with tokens as part of our compensation package.
  • Growth Opportunities: Be part of an exciting new project with significant career growth potential.
  • Innovative Work Culture: Join a team that’s at the cutting edge of Web3, AI, and DeFi, and help shape the future of the digital economy.
  • Fun and Engaging Team Activities: Game nights, virtual celebrations, and work retreats to keep things exciting.

At Biconomy, we believe in creating a diverse and inclusive workplace. We are committed to being an equal-opportunity employer, and we do not discriminate based on race, national origin, gender, gender identity, sexual orientation, disability, veteran status, age, or any other legally protected status.
