{"id":62773,"date":"2025-09-16T16:58:33","date_gmt":"2025-09-16T16:58:33","guid":{"rendered":"https:\/\/opportunitiesforyouth.org\/?p=62773"},"modified":"2025-09-16T16:58:33","modified_gmt":"2025-09-16T16:58:33","slug":"constellation-astra-fellowship-2026-fully-funded-in-person-program-advancing-ai-safety-research","status":"publish","type":"post","link":"https:\/\/opportunitiesconnect.com\/?p=62773","title":{"rendered":"Constellation Astra Fellowship 2026: Fully Funded In-Person Program Advancing AI Safety Research"},"content":{"rendered":"<p data-start=\"386\" data-end=\"863\">The <strong data-start=\"390\" data-end=\"424\">Constellation Astra Fellowship<\/strong> is a fully funded, 3\u20136 month, in-person program at Constellation\u2019s Berkeley research center, designed to accelerate <strong data-start=\"541\" data-end=\"563\">AI safety research<\/strong> and talent development. This flagship fellowship aims to bring exceptional individuals into the field of AI safety, connect them with leading mentors, and support high-impact research projects that address urgent and neglected challenges in AI governance, policy, security, and empirical research.<\/p>\n<p data-start=\"865\" data-end=\"1406\">As artificial intelligence advances at unprecedented speed, the Astra Fellowship seeks to prepare researchers and practitioners to tackle some of the most critical risks associated with frontier AI technologies. Over 80% of Astra\u2019s first cohort have transitioned into full-time roles at organizations including <strong data-start=\"1176\" data-end=\"1272\">Redwood Research, METR, Anthropic, OpenAI, Google DeepMind, and the UK AI Security Institute<\/strong>. 
The program not only places fellows into high-impact roles but also supports the launch of new initiatives in the AI safety space.<\/p>\n<hr data-start=\"1408\" data-end=\"1411\" \/>\n<h2 data-start=\"1413\" data-end=\"1434\">Program Overview<\/h2>\n<p data-start=\"1436\" data-end=\"1695\">The Astra Fellowship is an in-person program based at Constellation\u2019s Berkeley research center. Fellows work on <strong data-start=\"1548\" data-end=\"1579\">frontier AI safety projects<\/strong> with guidance from expert mentors and dedicated support from Constellation\u2019s research management and talent team.<\/p>\n<p data-start=\"1697\" data-end=\"1979\"><strong data-start=\"1697\" data-end=\"1718\">Program Duration:<\/strong> 3\u20136 months<br data-start=\"1729\" data-end=\"1732\" \/><strong data-start=\"1732\" data-end=\"1745\">Location:<\/strong> Berkeley, California, USA<br data-start=\"1771\" data-end=\"1774\" \/><strong data-start=\"1774\" data-end=\"1786\">Funding:<\/strong> Fully funded, including stipend, research budget, visa support (if required), and workspace access<br data-start=\"1885\" data-end=\"1888\" \/><strong data-start=\"1888\" data-end=\"1913\">Application Deadline:<\/strong> September 26, 2025<br data-start=\"1932\" data-end=\"1935\" \/><strong data-start=\"1935\" data-end=\"1961\">Fellowship Start Date:<\/strong> January 2, 2026<\/p>\n<hr data-start=\"1981\" data-end=\"1984\" \/>\n<h2 data-start=\"1986\" data-end=\"2007\">Who Should Apply<\/h2>\n<p data-start=\"2009\" data-end=\"2058\">The fellowship is suitable for individuals who:<\/p>\n<ul data-start=\"2060\" data-end=\"2582\">\n<li data-start=\"2060\" data-end=\"2128\">\n<p data-start=\"2062\" data-end=\"2128\">Are motivated to <strong data-start=\"2079\" data-end=\"2125\">reduce catastrophic risks from advanced AI<\/strong>.<\/p>\n<\/li>\n<li data-start=\"2129\" data-end=\"2293\">\n<p data-start=\"2131\" data-end=\"2293\">Bring <strong data-start=\"2137\" data-end=\"2179\">technical or 
domain-specific expertise<\/strong> relevant to AI safety, including technical research, security, governance, policy, strategy, or field-building.<\/p>\n<\/li>\n<li data-start=\"2294\" data-end=\"2417\">\n<p data-start=\"2296\" data-end=\"2417\">Are seeking to <strong data-start=\"2311\" data-end=\"2357\">transition into a full-time AI safety role<\/strong> or start an AI safety-focused initiative or organization.<\/p>\n<\/li>\n<li data-start=\"2418\" data-end=\"2582\">\n<p data-start=\"2420\" data-end=\"2582\">May have <strong data-start=\"2429\" data-end=\"2459\">prior AI safety experience<\/strong>, but it is not required. Many impactful fellows have entered from adjacent fields and quickly contributed significantly.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"2584\" data-end=\"2694\">The program encourages applications from motivated individuals even if they do not meet every qualification.<\/p>\n<hr data-start=\"2696\" data-end=\"2699\" \/>\n<h2 data-start=\"2701\" data-end=\"2726\">Fellowship Structure<\/h2>\n<p data-start=\"2728\" data-end=\"2788\">The Astra Fellowship is structured into three main phases:<\/p>\n<ol data-start=\"2790\" data-end=\"3499\">\n<li data-start=\"2790\" data-end=\"2937\">\n<p data-start=\"2793\" data-end=\"2840\"><strong data-start=\"2793\" data-end=\"2838\">Orientation &amp; Project Kickoff (Weeks 1\u20132)<\/strong><\/p>\n<ul data-start=\"2844\" data-end=\"2937\">\n<li data-start=\"2844\" data-end=\"2878\">\n<p data-start=\"2846\" data-end=\"2878\">Mentors present project ideas.<\/p>\n<\/li>\n<li data-start=\"2882\" data-end=\"2937\">\n<p data-start=\"2884\" data-end=\"2937\">Fellows scope and initiate their research projects.<\/p>\n<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"2939\" data-end=\"3173\">\n<p data-start=\"2942\" data-end=\"2990\"><strong data-start=\"2942\" data-end=\"2988\">Research &amp; Career Development (Weeks 2\u201312)<\/strong><\/p>\n<ul data-start=\"2994\" data-end=\"3173\">\n<li data-start=\"2994\" 
data-end=\"3075\">\n<p data-start=\"2996\" data-end=\"3075\">Fellows work closely with expert mentors and Constellation research managers.<\/p>\n<\/li>\n<li data-start=\"3079\" data-end=\"3173\">\n<p data-start=\"3081\" data-end=\"3173\">Receive <strong data-start=\"3089\" data-end=\"3112\">1:1 career coaching<\/strong> and recruiting support from the Constellation talent team.<\/p>\n<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"3175\" data-end=\"3499\">\n<p data-start=\"3178\" data-end=\"3232\"><strong data-start=\"3178\" data-end=\"3230\">Placement, Incubation, and Extension (Weeks 12+)<\/strong><\/p>\n<ul data-start=\"3236\" data-end=\"3499\">\n<li data-start=\"3236\" data-end=\"3295\">\n<p data-start=\"3238\" data-end=\"3295\">Fellows publish research outcomes and present findings.<\/p>\n<\/li>\n<li data-start=\"3299\" data-end=\"3405\">\n<p data-start=\"3301\" data-end=\"3405\">Receive support for <strong data-start=\"3321\" data-end=\"3334\">placement<\/strong> into full-time AI safety roles or <strong data-start=\"3369\" data-end=\"3383\">incubation<\/strong> of new initiatives.<\/p>\n<\/li>\n<li data-start=\"3409\" data-end=\"3499\">\n<p data-start=\"3411\" data-end=\"3499\">High-performing fellows may qualify for a <strong data-start=\"3453\" data-end=\"3496\">program extension of up to three months<\/strong>.<\/p>\n<\/li>\n<\/ul>\n<\/li>\n<\/ol>\n<hr data-start=\"3501\" data-end=\"3504\" \/>\n<h2 data-start=\"3506\" data-end=\"3540\">Mentorship and Research Areas<\/h2>\n<p data-start=\"3542\" data-end=\"3652\">Fellows are mentored by top experts from leading AI safety, governance, and policy organizations, including:<\/p>\n<p data-start=\"3654\" data-end=\"3688\"><strong data-start=\"3654\" data-end=\"3686\">Governance &amp; Policy Mentors:<\/strong><\/p>\n<ul data-start=\"3689\" data-end=\"3803\">\n<li data-start=\"3689\" data-end=\"3729\">\n<p data-start=\"3691\" data-end=\"3729\">Lennart Heim \u2013 Independent\/AI Policy<\/p>\n<\/li>\n<li 
data-start=\"3730\" data-end=\"3753\">\n<p data-start=\"3732\" data-end=\"3753\">Michael Chen \u2013 METR<\/p>\n<\/li>\n<li data-start=\"3754\" data-end=\"3803\">\n<p data-start=\"3756\" data-end=\"3803\">Additional experts from AI policy think tanks<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"3805\" data-end=\"3828\"><strong data-start=\"3805\" data-end=\"3826\">Security Mentors:<\/strong><\/p>\n<ul data-start=\"3829\" data-end=\"3984\">\n<li data-start=\"3829\" data-end=\"3866\">\n<p data-start=\"3831\" data-end=\"3866\">Buck Shlegeris \u2013 Redwood Research<\/p>\n<\/li>\n<li data-start=\"3867\" data-end=\"3892\">\n<p data-start=\"3869\" data-end=\"3892\">Keri Warr \u2013 Anthropic<\/p>\n<\/li>\n<li data-start=\"3893\" data-end=\"3925\">\n<p data-start=\"3895\" data-end=\"3925\">Nicholas Carlini \u2013 Anthropic<\/p>\n<\/li>\n<li data-start=\"3926\" data-end=\"3984\">\n<p data-start=\"3928\" data-end=\"3984\">Additional mentors from leading AI security institutes<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"3986\" data-end=\"4019\"><strong data-start=\"3986\" data-end=\"4017\">Empirical Research Mentors:<\/strong><\/p>\n<ul data-start=\"4020\" data-end=\"4257\">\n<li data-start=\"4020\" data-end=\"4051\">\n<p data-start=\"4022\" data-end=\"4051\">Trenton Bricken \u2013 Anthropic<\/p>\n<\/li>\n<li data-start=\"4052\" data-end=\"4078\">\n<p data-start=\"4054\" data-end=\"4078\">Sam Bowman \u2013 Anthropic<\/p>\n<\/li>\n<li data-start=\"4079\" data-end=\"4107\">\n<p data-start=\"4081\" data-end=\"4107\">Alec Radford \u2013 Anthropic<\/p>\n<\/li>\n<li data-start=\"4108\" data-end=\"4136\">\n<p data-start=\"4110\" data-end=\"4136\">Stephen McAleer \u2013 OpenAI<\/p>\n<\/li>\n<li data-start=\"4137\" data-end=\"4171\">\n<p data-start=\"4139\" data-end=\"4171\">Scott Emmons \u2013 Google DeepMind<\/p>\n<\/li>\n<li data-start=\"4172\" data-end=\"4257\">\n<p data-start=\"4174\" data-end=\"4257\">Additional senior researchers from Anthropic, OpenAI, Google DeepMind, and 
UKAISI<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4259\" data-end=\"4282\"><strong data-start=\"4259\" data-end=\"4280\">Strategy Mentors:<\/strong><\/p>\n<ul data-start=\"4283\" data-end=\"4396\">\n<li data-start=\"4283\" data-end=\"4321\">\n<p data-start=\"4285\" data-end=\"4321\">Ryan Greenblatt \u2013 Redwood Research<\/p>\n<\/li>\n<li data-start=\"4322\" data-end=\"4359\">\n<p data-start=\"4324\" data-end=\"4359\">Julian Stastny \u2013 Redwood Research<\/p>\n<\/li>\n<li data-start=\"4360\" data-end=\"4396\">\n<p data-start=\"4362\" data-end=\"4396\">Eli Lifland \u2013 AI Futures Project<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4398\" data-end=\"4427\"><strong data-start=\"4398\" data-end=\"4425\">Field Building Mentors:<\/strong><\/p>\n<ul data-start=\"4428\" data-end=\"4493\">\n<li data-start=\"4428\" data-end=\"4459\">\n<p data-start=\"4430\" data-end=\"4459\">Alex Fields \u2013 Constellation<\/p>\n<\/li>\n<li data-start=\"4460\" data-end=\"4493\">\n<p data-start=\"4462\" data-end=\"4493\">Lauren Mangla \u2013 Constellation<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"4495\" data-end=\"4668\">Mentorship is delivered through <strong data-start=\"4527\" data-end=\"4611\">weekly 1:1 sessions, small group meetings, office hours, and Slack collaboration<\/strong>, ensuring close guidance and professional development.<\/p>\n<hr data-start=\"4670\" data-end=\"4673\" \/>\n<h2 data-start=\"4675\" data-end=\"4700\">What Fellows Receive<\/h2>\n<p data-start=\"4702\" data-end=\"4791\">The Astra Fellowship provides comprehensive support to enable full-time research focus:<\/p>\n<ul data-start=\"4793\" data-end=\"5486\">\n<li data-start=\"4793\" data-end=\"4862\">\n<p data-start=\"4795\" data-end=\"4862\"><strong data-start=\"4795\" data-end=\"4807\">Stipend:<\/strong> Competitive financial support throughout the program<\/p>\n<\/li>\n<li data-start=\"4863\" data-end=\"4952\">\n<p data-start=\"4865\" data-end=\"4952\"><strong data-start=\"4865\" data-end=\"4885\">Research 
Budget:<\/strong> Approximately $15,000 per fellow per month for compute resources<\/p>\n<\/li>\n<li data-start=\"4953\" data-end=\"5133\">\n<p data-start=\"4955\" data-end=\"5133\"><strong data-start=\"4955\" data-end=\"4981\">Workspace &amp; Community:<\/strong> Access to Constellation\u2019s Berkeley research center, with ~150 network participants, daily seminars, workshops, shared meals, and AI safety convenings<\/p>\n<\/li>\n<li data-start=\"5134\" data-end=\"5250\">\n<p data-start=\"5136\" data-end=\"5250\"><strong data-start=\"5136\" data-end=\"5173\">Mentorship &amp; Research Management:<\/strong> Guidance from senior experts, 1:1 coaching, and career development support<\/p>\n<\/li>\n<li data-start=\"5251\" data-end=\"5340\">\n<p data-start=\"5253\" data-end=\"5340\"><strong data-start=\"5253\" data-end=\"5276\">Placement Services:<\/strong> Connections to full-time AI safety roles at top organizations<\/p>\n<\/li>\n<li data-start=\"5341\" data-end=\"5486\">\n<p data-start=\"5343\" data-end=\"5486\"><strong data-start=\"5343\" data-end=\"5367\">Incubation Services:<\/strong> Support for launching new AI safety projects, including business operations, communications, hiring, and fundraising<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"5488\" data-end=\"5491\" \/>\n<h2 data-start=\"5493\" data-end=\"5513\">Success Stories<\/h2>\n<p data-start=\"5515\" data-end=\"5580\">Previous fellows have achieved significant outcomes, including:<\/p>\n<ul data-start=\"5582\" data-end=\"5984\">\n<li data-start=\"5582\" data-end=\"5699\">\n<p data-start=\"5584\" data-end=\"5699\"><strong data-start=\"5584\" data-end=\"5613\">Eli Lifland &amp; Romeo Dean:<\/strong> Developed AI 2027 scenarios during Astra, later co-founding the AI Futures Project.<\/p>\n<\/li>\n<li data-start=\"5700\" data-end=\"5781\">\n<p data-start=\"5702\" data-end=\"5781\"><strong data-start=\"5702\" data-end=\"5719\">Michael Chen:<\/strong> Joined METR\u2019s policy team immediately after the 
fellowship.<\/p>\n<\/li>\n<li data-start=\"5782\" data-end=\"5892\">\n<p data-start=\"5784\" data-end=\"5892\"><strong data-start=\"5784\" data-end=\"5800\">Aryan Bhatt:<\/strong> Currently running a research team at Redwood Research following mentorship through Astra.<\/p>\n<\/li>\n<li data-start=\"5893\" data-end=\"5984\">\n<p data-start=\"5895\" data-end=\"5984\"><strong data-start=\"5895\" data-end=\"5911\">Martin Soto:<\/strong> Works on the technical staff at the UK AI Security Institute (UKAISI) after building expertise during Astra.<\/p>\n<\/li>\n<\/ul>\n<p data-start=\"5986\" data-end=\"6138\">These stories highlight the fellowship\u2019s ability to <strong data-start=\"6038\" data-end=\"6081\">launch high-impact careers in AI safety<\/strong> and contribute to transformative research initiatives.<\/p>\n<hr data-start=\"6140\" data-end=\"6143\" \/>\n<h2 data-start=\"6145\" data-end=\"6162\">How to Apply<\/h2>\n<p data-start=\"6164\" data-end=\"6334\">Applications for the <strong data-start=\"6185\" data-end=\"6224\">Constellation Astra Fellowship 2026<\/strong> are now open. 
Applicants are encouraged to apply early as applications are reviewed on a <strong data-start=\"6314\" data-end=\"6331\">rolling basis<\/strong>.<\/p>\n<p data-start=\"6336\" data-end=\"6502\"><strong data-start=\"6336\" data-end=\"6361\">Application Deadline:<\/strong> September 26, 2025<br data-start=\"6380\" data-end=\"6383\" \/><strong data-start=\"6383\" data-end=\"6409\">Fellowship Start Date:<\/strong> January 2, 2026<br data-start=\"6425\" data-end=\"6428\" \/><strong data-start=\"6428\" data-end=\"6443\">Apply <a href=\"https:\/\/constellation.fillout.com\/t\/kwFFQj7S2Xus\" target=\"_blank\" rel=\"noopener\">Here<\/a><\/strong><\/p>\n<p data-start=\"6504\" data-end=\"6555\">Applicants should submit proposals demonstrating:<\/p>\n<ul data-start=\"6557\" data-end=\"6734\">\n<li data-start=\"6557\" data-end=\"6616\">\n<p data-start=\"6559\" data-end=\"6616\">Interest and motivation in advancing AI safety research<\/p>\n<\/li>\n<li data-start=\"6617\" data-end=\"6659\">\n<p data-start=\"6619\" data-end=\"6659\">Relevant technical or policy expertise<\/p>\n<\/li>\n<li data-start=\"6660\" data-end=\"6734\">\n<p data-start=\"6662\" data-end=\"6734\">Commitment to making a high-impact contribution to the AI safety field<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"6736\" data-end=\"6739\" \/>\n<p data-start=\"6741\" data-end=\"7184\">The <strong data-start=\"6745\" data-end=\"6779\">Constellation Astra Fellowship<\/strong> represents a unique opportunity for motivated individuals to develop their skills, contribute to critical AI safety research, and transition into impactful roles within the AI safety ecosystem. 
By combining mentorship, career support, research resources, and community engagement, Astra ensures fellows are well-prepared to tackle some of the most pressing challenges posed by advanced AI technologies.<\/p>\n<p><b>For more information and applications:<\/b><\/p>\n<p>Visit <a href=\"https:\/\/www.constellation.org\/programs\/astra-fellowship\">HERE<\/a><\/p>\n<p>Stay on <a href=\"http:\/\/opportunitiesforyouth.org\">opportunitiesconnect.com\/<\/a> for more opportunities.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Constellation Astra Fellowship is a fully funded, 3\u20136 month, in-person program at Constellation\u2019s Berkeley research center, designed to accelerate AI safety research and talent development. 
This flagship fellowship aims to bring exceptional individuals into the field of AI safety, connect them with leading mentors, and support high-impact research projects that address urgent and neglected&#8230;<\/p>\n","protected":false},"author":1,"featured_media":62774,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[23,31,38,45],"tags":[],"class_list":{"0":"post-62773","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","6":"hentry","7":"category-america","8":"category-continent","9":"category-fellowship","10":"category-grants","11":"article","12":"","14":"has-excerpt","15":"has-avatar","16":"has-author","17":"has-nickname","18":"has-date","19":"has-comment-count","20":"has-category-meta","21":"has-read-more","22":"has-title","23":"has-post-media","24":"thumbnail-","25":"has-tfm-share-icons"},"_links":{"self":[{"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/posts\/62773","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=62773"}],"version-history":[{"count":0,"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/posts\/62773\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=\/wp\/v2\/media\/62774"}],"wp:attachment":[{"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=62773"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/opportunitiesconnect.com\/
index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=62773"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/opportunitiesconnect.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=62773"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}