The Longtermism Fund is pleased to announce that we will be providing grants to the following organisations in our first-ever grantmaking round:
These grants will be paid out in January 2023; the amounts were chosen based on the donations the Fund has received as of today.
In this grants report, we will provide more details about the grantmaking process and the grantees the Fund is supporting. This report was written by Giving What We Can, which is responsible for the Fund's communications. Longview Philanthropy is responsible for the Fund's research and grantmaking.
Longview actively investigates high-impact funding opportunities for donors looking to improve the long-term future. This means the grants are generally decided by:
In addition to this, the Fund decided to support a diverse range of longtermist causes this grantmaking round. Our current plan is to provide grant reports approximately every six months.
The scope of the Longtermism Fund is to support organisations that are:
In addition, the Fund aims to support organisations with a compelling and transparent case for their cost-effectiveness that most donors interested in longtermism can understand, and/or that would benefit from being funded by a large number of donors.
There are several major risks to the long-term future that need to be addressed, and they differ in their size, potential severity, and how many promising solutions are available and in need of additional funding. However, there is a lot of uncertainty about which of these risks are most cost-effectively addressed by the next philanthropic dollar, even among expert grantmakers.
Given this, the Fund aimed to:
A benefit of this approach is that the grants highlight a diverse range of approaches to improving the long-term future.
We expect the Fund to take a similar approach for the foreseeable future, but we could imagine it changing if there is a persistent and notable difference in the cost-effectiveness of funding opportunities between causes. In that case, so long as those opportunities are within the Fund's scope, funding the most cost-effective opportunities will likely take priority over supporting a more diverse range of causes.
This section will provide further information about the grantees, along with a specific comment from Longview on why they chose to fund each organisation and how they expect the funding to be used. Each grantee's name links to either a charity page, where you can learn more about the organisation and support it through a direct donation, or to a public writeup describing its work in more depth.
CHAI is a research institute focused on developing AI that is aligned with human values and will ultimately improve the long-term future for all beings. Their work includes conducting research on alignment techniques such as value learning, as well as engaging with policymakers and industry leaders to promote the development of safe and beneficial AI.
Longview: “We recommended a grant of $70,000 to CHAI on the basis of CHAI’s strong track record of training and enabling current and future alignment researchers, and building the academic AI alignment field. These funds will go toward graduate students, affiliated researchers and infrastructure to support their research.”
SecureBio is led by Prof. Kevin Esvelt of the MIT Sculpting Evolution Group, one of the leading researchers in the field of synthetic biology. In addition to contributing to developments in CRISPR and inventing CRISPR-based gene drives, he has published work in Nature and Science that has been covered in The New York Times, The New Yorker, The Atlantic, PBS and NPR. SecureBio is working on some of the most promising biosecurity projects for tackling the scenarios most likely to cause permanent damage to the world, such as preventing bad actors from abusing DNA synthesis, building infrastructure to detect the next pandemic very early, and developing advanced PPE and air sanitisation.
Longview: “We recommended a grant of $60,000 to SecureBio on the basis of their intense focus on the most extreme biological risks and their track record of developing promising approaches to tackling such risks. These funds will go toward research scientists, an air sanitisation trial and setting up a policy unit.”
The General Longtermism Team at Rethink Priorities works on improving strategic clarity about how to improve the long-term future, such as whether and how much to invest in new longtermist causes. They have written about their past work and future plans in a recent EA Forum post.
Longview: “We recommended a grant of $30,000 to Rethink Priorities’ research and development work in a range of established and experimental existential risk areas on the basis of Rethink Priorities’ track record of generating decision-relevant research across a range of causes, and the promise of their research directions in existential risk areas. These funds will go toward researchers (including fellows) and research support.”
The Council on Strategic Risks is an organisation that focuses on reducing the risk of catastrophic events, including nuclear war. Their work on nuclear issues includes conducting research on nuclear proliferation and deterrence, engaging with policymakers to promote nuclear disarmament, and raising awareness about the risks of nuclear conflict.
Longview: “We recommended a grant of $15,000 to the Council on Strategic Risks’ nuclear programmes on the basis of their focus on the most extreme nuclear risks, ability to propose concrete policy actions which would reduce the risk of escalation into nuclear war, and strong networks within U.S. national and international security communities. These funds will go toward policy development and advocacy, as well as fellowships.”
We are grateful for the support of the 295 donors who have contributed over $175,000 USD in the Fund's first few months. We are excited to support the work of these organisations and believe that their activities are likely to have a significant positive impact on the long-term future. Please feel free to ask any questions in the comments on the EA Forum version of this grants report.