Application Guide
How to Apply for the Research Engineer Role
at FAR.AI
About FAR.AI
FAR.AI is a non-profit AI research institute uniquely positioned to conduct cutting-edge safety research at a scale beyond most academic labs, with the freedom of a non-profit and a track record of publishing at top conferences such as NeurIPS and ICML. Working here means contributing directly to ensuring advanced AI is safe and beneficial for everyone, alongside a rapidly growing team of 30+ researchers and policy experts.
About This Role
As a Research Engineer, you will bridge the gap between theoretical AI safety research and practical implementation, building tools, running experiments, and engineering systems that advance our understanding of AI risks. Your work will directly support in-house research and grant-funded projects, enabling breakthroughs that shape global AI safety efforts.
A Day in the Life
A typical day might involve coding up a new interpretability technique in PyTorch, collaborating with researchers via Slack or Notion to debug experiments, and writing documentation for internal tools. You might also attend a remote all-hands meeting or a paper discussion group, then spend the afternoon running large-scale evaluations on a cluster.
Who FAR.AI Is Looking For
- Strong software engineering skills with experience in Python and deep learning frameworks (PyTorch, JAX, TensorFlow).
- Familiarity with AI safety research topics such as alignment, interpretability, or robustness, and a passion for contributing to these areas.
- Ability to work independently in a remote setting, with excellent communication skills to collaborate with a distributed team.
- Demonstrated experience in building research infrastructure, running large-scale experiments, or contributing to open-source AI projects.
Tips for Applying to FAR.AI
- Tailor your resume to highlight any AI safety-related projects, even informal ones like participation in alignment workshops or reading groups.
- Include links to your GitHub or portfolio with concrete examples of engineering work that supported research (e.g., reproducibility scripts, experiment pipelines).
- Mention any familiarity with FAR.AI's published work or the Guaranteed Safe AI roadmap in your cover letter.
- Given the remote nature of the role, emphasize your self-management and async communication skills.
- Apply early; the deadline is Jan 13, 2026, but positions at non-profits can fill quickly.
What to Emphasize in Your Cover Letter
["Explain your motivation for working in AI safety and why you specifically want to contribute to FAR.AI's mission.", 'Highlight your engineering contributions to research projects, especially any that improved efficiency or reproducibility.', "Demonstrate understanding of FAR.AI's three-pronged approach (Research, Futures, Labs) and how your role fits.", 'Mention any relevant experience with non-profit or mission-driven organizations.']
Research Before Applying
To stand out, make sure you've researched:
- Read FAR.AI's recent publications on their website (e.g., on interpretability or safety cases).
- Understand the Guaranteed Safe AI roadmap co-authored with Yoshua Bengio.
- Look into FAR.AI's grant-making process and how research engineers support grantees.
- Familiarize yourself with the San Diego Alignment Workshop and its outcomes.
Common Mistakes to Avoid
- Submitting a generic cover letter that doesn't mention AI safety or FAR.AI's specific work.
- Overemphasizing academic credentials without showing practical engineering impact.
- Ignoring the non-profit aspect: don't focus solely on compensation or prestige.
Application Timeline
This position is open until filled. However, we recommend applying as soon as possible as roles at mission-driven organizations tend to fill quickly.
Typical hiring timeline:
- Application Review: 1-2 weeks
- Initial Screening: phone call or written assessment
- Interviews: 1-2 rounds, usually virtual
- Offer: congratulations!