Create flexible and precise queries that fit your needs exactly. Example:
React.js, -USA
Laravel, Vue.js, -Contract
will get you jobs that are (React.js and not in USA) or (Laravel and Vue.js and not Contract/Freelance).
You can mix and match any tags, negations and groups in any order. And don't worry about typos – the search is fuzzy.
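For illustration only, here is a minimal sketch of how a query like the one above could be evaluated against a job's tag list, assuming comma-separated terms within a group are ANDed, a leading "-" negates a term, groups are ORed together, and fuzzy matching is approximated with a simple similarity ratio. All function names and the threshold are hypothetical, not the site's actual implementation.

```python
from difflib import SequenceMatcher

def fuzzy_match(term: str, tag: str, threshold: float = 0.8) -> bool:
    """Approximate, typo-tolerant comparison of a search term and a job tag."""
    return SequenceMatcher(None, term.lower(), tag.lower()).ratio() >= threshold

def group_matches(group: str, job_tags: list[str]) -> bool:
    """A group like 'Laravel, Vue.js, -Contract' matches when every positive
    term fuzzily matches some tag and no negated term does."""
    for raw in (t.strip() for t in group.split(",") if t.strip()):
        negated = raw.startswith("-")
        term = raw.lstrip("-")
        hit = any(fuzzy_match(term, tag) for tag in job_tags)
        if hit == negated:  # required term missing, or negated term present
            return False
    return True

def query_matches(groups: list[str], job_tags: list[str]) -> bool:
    """Groups are ORed: the job matches if any single group matches."""
    return any(group_matches(g, job_tags) for g in groups)

# The example query from above, checked against two hypothetical jobs
query = ["React.js, -USA", "Laravel, Vue.js, -Contract"]
print(query_matches(query, ["React.js", "Remote", "Canada"]))   # True
print(query_matches(query, ["Laravel", "Vue.js", "Contract"]))  # False
```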
🧩 Senior Full-Stack Developer – Node.js & React (Contractor, Remote)
Kick off your next career move with Launchpad! We're looking for a seasoned full-stack developer ready to make an impact in a fast-paced, product-driven environment. If you're passionate about building scalable applications with modern JavaScript frameworks, this might be the opportunity for you.
🗓 Start date: ASAP
📆 Contract type: Contractor, long-term
🌐 Work hours: 7:30 to 16:30 PST
🏡 Work mode: 100% remote
🛠️ What You’ll Be Doing
Build state-of-the-art applications using modern JavaScript (Node.js, React) to deliver seamless user experiences across platforms.
Collaborate with UI/UX designers and specialists to implement responsive, adaptable frontends.
Work closely with architects and product managers to translate features into actionable plans, ensuring business and user needs are met.
Participate in the design and execution of scalable methodologies for high-quality software delivery.
Contribute to documentation and testing processes to support maintainability and team-wide clarity.
Take ownership of full lifecycle development: from design to deployment, including debugging and performance tuning.
✅ What You Need to Succeed
Must-haves
8+ years of experience in web application development (JavaScript, HTML, CSS, React, Node.js).
Solid experience with single-page applications and microservices architectures.
Strong skills in building RESTful APIs using Node.js frameworks like Express or Sails.js.
Experience with Docker, AWS, and cross-browser/multi-device development.
Familiarity with GitLab pipelines and Agile methodologies (Scrum/Kanban).
Excellent communication and collaboration skills in English (Advanced level).
Availability to work in a remote, distributed team across the Americas.
Nice-to-haves
Working knowledge of SQL queries, Tableau, and Snowflake.
Familiarity with Splunk or similar observability tools.
Experience ensuring compliance with security, privacy, and regulatory standards.
Previous work in product companies or startups is a plus.
🧭 Our Recruitment Process
Here's what to expect from our candidate-friendly interview process:
Initial Interview – 60 minutes with our Talent Acquisition Specialist
Culture Fit – 30 minutes with our Team Engagement Manager
Take-Home Test
Final Stage – 60 minutes with the Hiring Manager (technical interview)
🌟 Why Join Launchpad?
We believe that great work starts with great people. At Launchpad, we offer:
💻 Fully remote work with hardware provided
🌎 Global team experience with clients around the globe
💸 Competitive USD compensation
📚 Training and learning stipends
🌴 Paid Time Off (vacation, personal, study)
🧘‍♂️ A culture that values autonomy, purpose, and human connection
✨ Ready to make your mark? Apply now and be part of something exciting.
Apply Now:
About Zscaler
Serving thousands of enterprise customers around the world, including 40% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world's largest security cloud, Zscaler accelerates digital transformation so enterprises can be more agile, efficient, resilient, and secure. The pioneering, AI-powered Zscaler Zero Trust Exchange™ platform, which is found in our SASE and SSE offerings, protects thousands of enterprise customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. Named a Best Workplace in Technology by Fortune and others, Zscaler fosters an inclusive and supportive culture that is home to some of the brightest minds in the industry. If you thrive in an environment that is fast-paced and collaborative, and you are passionate about building and innovating for the greater good, come make your next move with Zscaler.
Our Engineering team built the world's largest cloud security platform from the ground up, and we keep building. With more than 100 patents and big plans for enhancing services and increasing our global footprint, the team has made us and our multitenant architecture today's cloud security leader, with more than 15 million users in 185 countries. Bring your vision and passion to our team of cloud architects, software engineers, security experts, and more who are enabling organizations worldwide to harness speed and agility with a cloud-first strategy.
We're looking for an experienced Senior Software Development Engineer to join our team. Reporting to the Engineering Manager, you'll be responsible for:
Designing, analyzing, and troubleshooting large-scale distributed systems
Contributing to continuous monitoring, vulnerability scanning, patching, and reporting of the system
What We're Looking for (Minimum Qualifications)
2+ years of public cloud experience (AWS, GCP, or Azure) and Kubernetes
Expertise in designing, analyzing, and troubleshooting large-scale distributed systems
Experience with Infrastructure as Code and programming languages like Python and Java
Experience in data engineering (Spark, DBT, Temporal, SQL, etc.) is a plus but not a must-have
What Will Make You Stand Out (Preferred Qualifications)
Experience in continuous monitoring, vulnerability scanning, patching, and reporting
Experience with multiple cloud providers (AWS, Azure) and both relational and non-relational databases for microservices
Bachelor's degree in Science, Engineering, IT, or equivalent
Our Benefits program is one of the most important ways we support our employees. Zscaler proudly offers comprehensive and inclusive benefits to meet the diverse needs of our employees and their families throughout their life stages, including:
Various health plans
Time off plans for vacation and sick time
Parental leave options
Retirement options
Education reimbursement
In-office perks, and more!
By applying for this role, you agree to adhere to applicable laws, regulations, and Zscaler policies, including those related to security and privacy standards and guidelines.
Zscaler is committed to providing equal employment opportunities to all individuals. We strive to create a workplace where employees are treated with respect and have the chance to succeed. All qualified applicants will be considered for employment without regard to race, color, religion, sex (including pregnancy or related medical conditions), age, national origin, sexual orientation, gender identity or expression, genetic information, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. See more information in the Know Your Rights: Workplace Discrimination is Illegal notice.
Pay Transparency
Zscaler complies with all applicable federal, state, and local pay transparency rules.
Zscaler is committed to providing reasonable support (called accommodations or adjustments) in our recruiting processes for candidates who are differently abled, have long-term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support.
Apply Now:
CoW DAO is on a mission to protect Ethereum users from MEV and optimize trade execution across DeFi. We achieve this through the CoW Protocol, CoW Swap (a leading intent-based DEX aggregator), and the innovative MEV Blocker, which together help secure, aggregate, and route trades for optimal outcomes. We also fund values-aligned projects via the CoW Grants Program.
CoW Protocol is consistently ranked among the top DEX aggregators by monthly volume and is the largest intent-based exchange. Our MEV Blocker protects trades from harmful MEV extraction and is integrated across the Ethereum ecosystem. The CoW AMM is the only live AMM designed to protect liquidity providers from LVR (loss-versus-rebalancing).
With over 100 open-source repositories on GitHub, we’re transparent, community-driven, and deeply committed to the open-source ethos. Our real-time Dune Analytics dashboard showcases billions in cumulative trading volume and a rapidly growing user base. As we continue to scale, CoW DAO remains at the forefront of DeFi innovation, prioritizing security, efficiency, and decentralization.
At CoW DAO, data plays a central role in how we understand on-chain activity and operate our fair combinatorial auction system.
We’re looking for a Data Engineer who’s excited to dive deep into blockchain data (Ethereum and other L1/L2s), enrich it with insights from our off-chain auction system, and build the pipelines and infrastructure to support operations and analytics. This is a hands-on role where you’ll work across teams, design scalable systems, and make high-quality data accessible and actionable across the organization.
If you thrive in a fast-moving environment, love working with complex datasets, and want to shape the data foundation of a leading DeFi protocol, we’d love to hear from you.
Earn 4,000 USDC/USD through our refer-to-earn program.
Life within the CoW Protocol is an incredible adventure! We take pride in our collaborative approach, embracing autonomy and fostering a culture of big thinking and continuous growth. We value impact, ownership, simplicity, and team spirit. Plus, we’re all about feedback, coming together, and enjoying the journey along the way!
At CoW Protocol, we strive to create a space where everyone feels included and empowered. We believe that our products and services benefit from our diverse backgrounds and experiences. All qualified applicants are considered for positions regardless of race, ethnic origin, age, religion or belief, marital status, gender identification, sexual orientation, or physical ability.
Join the Future of Crypto Investing!
Are you passionate about blockchain, cryptocurrencies, and building revolutionary platforms? CRYLO® is a cutting-edge FinTech startup revolutionizing DeFi with Artificial Intelligence (AI) and Machine Learning (ML) algorithms. We are building a platform that simplifies crypto investing for the masses, from market analysis to personalized crypto portfolio creation and fully automated trading.
Join this ambitious team as our Lead Blockchain Developer to shape the future of decentralized finance! You'll work closely with the CTO and our quant and frontend developers to create an intuitive experience for investors entering the DeFi space.
This is your chance to be part of a startup that’s transforming decentralized finance (DeFi) while working closely with an inspiring team. Join us and help develop the backend of our MVP for the world’s first AI-driven wealth management platform.
If you’re ready to take your blockchain and backend development skills to the next level and be part of a groundbreaking project, we’d love to hear from you!
Please note that we only accept applications directly from individuals and not through staffing firms, and a complete application must include a resume and a list of links to your recent project references.
Apply Now:
Chorus One is one of the leading operators of infrastructure for Proof-of-Stake networks and decentralized protocols. Tens of thousands of retail customers and institutions are staking billions in assets through our infrastructure, helping to secure protocols and earn rewards. Our mission is to increase freedom and speed of innovation through decentralized technologies.
We are a diverse team of around 75 people distributed all over the globe. We value radical transparency, striving for excellence and improvement while treating each other with kindness and generosity. If this sounds like you, we’d love to hear from you.
As a Data Engineer, you will be responsible for the design, implementation, and maintenance of data pipelines that power user-facing products as well as internal tools. You will integrate with internal blockchain RPC nodes and third-party APIs to fetch stake and rewards data for multiple blockchains, and store and process this data to power internal reports and dashboards, as well as external customer reporting. You will play a critical role in enabling data-driven decision-making.
Our current data pipelines are implemented in Python, and run on a mix of Apache Airflow and Kubernetes, storing data in Postgres and Google Cloud Storage. We use various dashboarding and analysis tools, including Streamlit.
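To give a feel for the kind of pipeline described above, here is a minimal sketch of an Airflow 2.x TaskFlow DAG that fetches staking-rewards data from an API and loads it into Postgres. This is not Chorus One's actual code; the endpoint URL, connection ID, table name, and field names are all illustrative assumptions.

```python
from datetime import datetime, timedelta

import requests
from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook

# All identifiers below (URL, connection ID, table, fields) are illustrative assumptions.
RPC_URL = "https://rpc.example.com"   # placeholder rewards API endpoint
POSTGRES_CONN_ID = "rewards_db"       # Airflow connection configured separately
TABLE = "staking_rewards"

@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def staking_rewards_pipeline():
    @task
    def fetch_rewards() -> list[dict]:
        # Query the node / third-party API for the latest rewards snapshot.
        resp = requests.get(f"{RPC_URL}/rewards", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load_to_postgres(rows: list[dict]) -> None:
        # Load the snapshot into the reporting table used by dashboards.
        hook = PostgresHook(postgres_conn_id=POSTGRES_CONN_ID)
        hook.insert_rows(
            table=TABLE,
            rows=[(r["validator"], r["epoch"], r["amount"]) for r in rows],
            target_fields=["validator", "epoch", "amount"],
        )

    load_to_postgres(fetch_rewards())

staking_rewards_pipeline()
```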
What we are looking for: