Top 5 AI ‘must-knows’ for philanthropy and not-for-profits leaders

In the second part of this series, Kristi Mansfield, Founder & CEO, Seer Data, outlines what you need to know to avoid the hype and implement AI safely.
I’ve been in tech for 25 years and I can safely say that it is mind-boggling how quickly AI is evolving. Tech and data leaders are struggling to keep up, let alone anyone else. Here’s what philanthropic and not-for-profit (NFP) leaders and boards need to know now.
1. Like it or not, GenAI is reshaping grantmaking and NFP operations
Generative AI (GenAI) is increasingly used to streamline communications, identify or score grants, support reporting, conduct needs analysis and tell stories of impact. Service design, product innovation, predictive modelling and advanced analysis are also part of the future opportunity. Efficiency is the obvious benefit.
While GenAI can save hours of manual labour, there are important considerations to know.
Inconvenient truths:
- GenAI can speed up the process, but it doesn't replace craft, sense-making and storytelling. AI often 'hallucinates', producing mistaken facts or interpretations.
- Asking AI to score applications or make judgments will echo dominant narratives back to you. If you care about changing those narratives, you need to be aware of AI's in-built bias.
- The 'share' feature of ChatGPT, and other AI chatbots, can expose your chats to search engines.
- GenAI tools are important, yet we all know philanthropy and giving centres on trusted relationships.
- Very few organisations have guardrails in place to use AI tools safely. Larger NFPs are ahead on this front and philanthropic foundations are lagging.
- Data quality is inconsistent, so until there is more investment in data and tech systems across the sector, AI applications will largely be limited to GenAI. With stronger foundations, other uses could include predicting needs, optimising supply chains and operations, personalising services and advanced analytics for increasing impact.
- NFP and foundation boards currently lack the skills in tech, cyber, data and AI to deploy AI safely and ethically.

We’re in the early stages of the hype adoption cycle (the initial excitement stage in the process of new tech reaching acceptance and productivity), so NFPs and philanthropy must know the right questions to ask when developing an AI strategy and working with AI tech vendors (which are springing up weekly). Members of the Seer Data Board, all of whom are global leaders in data, tech and AI, regularly caution our team to treat AI vendors with scepticism.
Remember, the 2024 reforms to the Privacy Act raised the penalties for serious breaches of privacy (up to $2.5m for individuals and $50m for organisations) and require organisations and individuals to take reasonable precautions to protect personal information and prevent re-identification through clear policies, accountability and secure data practices.
2. The NFP sector has medium-to-high exposure, but frontline roles stay human
AI is likely to disrupt admin-heavy roles such as reporting, donor relations and compliance, but direct support, advocacy and building trusted human relationships with partners remain at the heart of social services.
Workforce strategies should now focus on reskilling for strategic and relational expertise, not just task automation.
3. Transformation is guaranteed, stability is not
AI is rapidly reshaping financial services, legal, software, logistics, retail and other industries. This has knock-on effects: foundations or donors in these sectors may change behaviours, and communities may face new employment shifts. Foundations and NFPs should consider the potential impacts on the communities they fund and serve over the near-to-medium term.
Inconvenient truths:
- No one actually knows the full extent of the potential disruption to Australian society through job losses and displacement. While technology companies project that thousands of jobs will be created in AI, it’s unclear how many will be lost or will never open up to graduates.
- Over the next two decades, society is likely to change in ways we would struggle to recognise today. Philanthropy’s role will be immense, and boards should be planning now for effective responses and mobilisation.
Boards should ask: how could AI disruption affect our strategy now, in three years and in the next decade? Who should we be partnering with, and do we have the right skills on our board?
4. Develop the guardrails for AI use
AI assurance is critical if you’re using or planning to use AI.
Practical frameworks already exist. The Australian Government’s pilot AI Assurance Framework, led by the Digital Transformation Agency (DTA), helps assess AI projects against Australia’s AI Ethics Principles. It’s designed to ensure safe, ethical and transparent deployment.
Australia’s National Framework for the Assurance of Artificial Intelligence in Government lays out structures to ensure AI aligns with human rights and ethics.
Seer Data will soon make available a short guide on how boards can develop compliance actions to voluntarily adopt the guardrails outlined in the Australian AI voluntary safety standard. Learn more on the website.
5. Transparency is non-negotiable for community trust
Trust is the foundation of philanthropy and the NFP sector. If AI is used in grant scoring, reporting or impact measurement, then stakeholders and communities must know. Being clear about AI’s role protects credibility and supports ethical engagement.
This aligns with emerging standards in Trustworthy AI, which emphasise transparency, explainability, accountability and privacy protections.
So, what should you do?
In my first article, I outlined 3 priority actions:
- Educate your board and executive team.
- Prioritise data foundations and governance.
- Create guardrails through an AI strategy and assurance framework.
Here are 3 more priority actions you should take now.
- Invest in workforce readiness: skill up teams to build AI awareness and application in your organisation. Focus on building skills in strategic, relational and community roles, while preparing to automate admin-heavy tasks for efficiency. Infoxchange offers a number of fantastic AI resources and free courses. Consider whether you have adequate skills on your board; if not, recruit for them.
- Prioritise transparency and trust: be transparent about when and how AI is used in scoring, reporting, service design or storytelling to maintain credibility and community trust.
- Build systems for data management and quality: poor data quality and underinvestment in safe, secure systems for storage and management are still big problems in our sector. Investment in safe technology remains immature and incomplete for many NFPs, and it must be addressed to make the best use of data analysis and AI.
Learn more
Australian Charities and Not-for-profits Commission (ACNC) guide: Charities and artificial intelligence.
Thank you to Seer Data Board Member Dr Ian Oppermann, Jo Garner, my team, and our many customers in NFPs and philanthropic foundations for your review of and contribution to this article. Your insights and feedback have been essential.
Kristi Mansfield is a global leader in AI and data with 25 years’ experience across technology, data governance, cross-border data sharing, strategy, and philanthropy. As Founder and CEO of Seer Data, she has represented Australia at the G20 on data-sharing across borders, was on the working group for the development of the Framework for the Governance of Indigenous Data, and led transformative roles in corporate, technology and NFP sectors.