Seb Barker, COO of Beam, writes about their AI tool ‘Magic Notes’ and how it can improve support for people experiencing homelessness.

“The feedback from service users is great but we’re miles behind on paperwork - is there anything we can do to help?”

Graham, who was on my team when I worked in the Drug and Alcohol Service, told me this just about every time we had a catch-up.

Graham’s job was to assess people in homeless hostels for detox and rehab, with the ultimate goal of helping them back into a stable, long-term home. He was passionate about the cause, having been through the system himself a few years back. This was his first full-time job, and he initially loved being out in the field, enthusiastically relating his own experiences to those considering treatment.

But after a few months in the role, the charity we worked for brought in a new database. Instead of handwriting and scanning notes, Graham now needed to type up assessments.

On a good day, this added an hour. But on most days, it was much longer.

After months of these conversations, and with Graham’s job now heavily weighted towards typing at a computer, it felt sad, if not surprising, when he announced he was leaving for a new role - in another sector.

Graham’s predicament is sadly all too common. Studies show support workers can spend around 70% of their time writing up notes - that’s the majority of the working week.

We saw the exact same challenge first-hand when we started Beam.

Beam has now directly supported more than 5,000 people into homes and jobs over the last seven years. In doing so, we’ve seen first-hand the power of support workers to change lives. Through their empathy, kindness, support and deep knowledge and experience, they can be the difference between someone sleeping on the streets and someone moving into a home of their own.

But, too often, they’re held back. Poor, outdated technology slows them down, so they spend more time navigating complex systems and less time directly supporting people.

AI offers one way to shift this balance, allowing professionals to focus more on people than paperwork.

How AI can improve support

Our AI tool, Magic Notes, transcribes meetings between frontline workers and service users and provides a first draft of a case note, assessment, letter or report, which the professional can then finalise.
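To make that workflow concrete, here is a minimal sketch of a transcribe-then-draft pattern with a mandatory human sign-off step. It is purely illustrative: the function names and the placeholder transcription and drafting steps are assumptions made for this example, not Beam’s actual implementation, and a real deployment would plug in whichever speech-to-text and drafting services an organisation has assessed for privacy and security.

# Illustrative sketch only: a generic "transcribe, draft, human review" pipeline.
# The names below are hypothetical stand-ins, not Beam's implementation.

from dataclasses import dataclass

@dataclass
class CaseNote:
    transcript: str
    draft: str
    approved: bool = False  # nothing is filed until a professional signs it off

def transcribe(audio_path: str) -> str:
    # Placeholder for a speech-to-text service chosen by the organisation.
    return "Transcript of the meeting would appear here."

def draft_case_note(transcript: str) -> str:
    # Placeholder for a drafting step using a language model.
    return "Draft case note summarising: " + transcript

def human_review(note: CaseNote, edited_text: str) -> CaseNote:
    # The frontline worker edits and approves the draft before it is saved.
    note.draft = edited_text
    note.approved = True
    return note

# Record -> transcribe -> draft -> professional finalises.
transcript = transcribe("meeting_recording.wav")
note = CaseNote(transcript=transcript, draft=draft_case_note(transcript))
note = human_review(note, edited_text=note.draft + " (checked and amended by the worker)")
assert note.approved  # only approved notes would ever reach the case record

The point of the pattern is the final step: the professional, not the AI, decides what goes on the record.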

Independent research, including evaluations led by Professor Robert Procter at the University of Warwick, has shown that this tool can give each professional eight hours back per week - an extra working day.

But this is about much more than productivity: AI can also enhance the quality of interactions.

Without the distraction of note-taking, professionals can maintain eye contact, actively listen and engage in more meaningful conversations. This not only improves the experience for individuals receiving care, but also strengthens the support they receive. The success of this approach has seen it expand beyond homelessness into social and community care.

Using AI safely

While AI has the potential to help prevent homelessness, it must be deployed responsibly.

And this is especially true for organisations working on the frontline of supporting the most vulnerable members of our communities, such as charities and local authorities.

In the development of Magic Notes, we’ve used ‘safety’ as a core design principle - and our framework for safe AI development revolves around privacy, security, responsibility and accuracy.

Privacy

Handling sensitive personal data requires robust safeguards. Government and third sector bodies should ensure that AI providers comply with strict data protection regulations and that data remains securely stored within the UK.

Security

AI tools must be designed with top-tier cybersecurity measures to prevent breaches. Ensuring compliance with industry-standard security certifications is one way to do this.

Responsibility

AI should augment, not replace, human expertise. The best tools ensure that professionals remain in control by requiring human oversight of all AI-generated outputs.

Accuracy

AI must provide reliable and clear outputs. We should seek solutions that have been tested, with transparent methodologies and continuous refinement based on real-world use and frontline worker feedback.

Technology like AI can make frontline work more effective, freeing up professionals to focus on the people who need them most. In doing so, it can be a vital tool in our arsenal in the fight against homelessness.

But responsibility and safety must be at the forefront of our minds as we deploy it.