Risk and responsibility
As different forms of AI evolve, the organisations developing and using the technology need to be fully aware of the risks and responsibilities involved.
“I don’t think there is any AI in social housing, at least not in the way that people assume AI is.”
This is Ryan Dempsey’s forthright assessment of where the sector currently stands in relation to the technology. As the founder and chief executive of TCW, a service that uses compliance data to advise housing clients managing nearly 2 million homes between them, Dempsey’s own business uses AI on a daily basis. But he makes an important distinction between risk-based AI, which is predicated on human instruction, and AGI, or Artificial General Intelligence.
Context-based AI
“When people talk about AI – about it learning from the information given to it and making decisions that the user doesn’t know it’s going to make – what they are talking about is general intelligence, to the point where it’s leaning into super intelligence.
“The issue you’ve got is that AI doesn’t exist yet, not in a true form. You might have the likes of Microsoft or Amazon playing with it, but within housing, we have something much more like context-based AI.”
This context-based AI, Dempsey explains, is one that might combine existing capabilities and technologies to drive efficiency. He gives the example of someone telling a friend they want a fresh sandwich: “Now, as soon as I’ve said that, the technology is going to say ‘well, I know where you are geographically because I have location services; I also know that there’s a Google map somewhere that I can search outwards to find the nearest bakery, so I can suggest a route to take you to the nearest bakery; and I know what the cost of it is, and I can get all of that from the conversation we’ve just had.’
“I think that capability is in housing right now. There’s a couple of brands out there that are harnessing that kind of context-based AI and it works very well. Now that’s really clever, and it’s really good stuff, and it’s going to create so much efficiency within the sector. But the issue you have is that when people are saying ‘come and use our AI; we’ve built an AI model’, they haven’t. What they’ve done is they’ve created the facility to make things more efficient and they’re trying to promote something up here that really is something that’s a little bit lower down in the chain of command.”
Ryan Dempsey
Founder and CEO, TCW
“There needs to be a form of accountability to that individual when the council or the authority gets fined millions of pounds for data protection breaches.”
A word of caution
Beyond driving efficiency, Dempsey insists that social housing providers and any organisations supplying their technology need to be especially careful when deploying AI – either as it exists now or in future iterations – because of the particular responsibility they have when it comes to their customers.
“What they’ve got to try to understand is that when they’re promoting the service that they’ve built and they’re trying to sell into a sector, they’re going to have to do that with a greater sense of integrity and accountability,” he says.
“Because if you’re telling a client that you can use AI, but you have to hide the fact that there is a risk of bias or a risk around breaching regulations – and you’re doing that because you want to improve your bottom line or close a deal for a contract – then there needs to be a form of accountability to that individual when the council or the authority gets fined millions of pounds for data protection breaches.
“If you’re telling them you can do X and you provide Y and they assume that they’re covered, and then they get dragged through court, the person who told them that they can do X should be accountable and dragged through court with them. And I don’t think that’s well understood in the sector at the moment.
“I think when people say they use AI, everybody’s eyes light up and they get really excited at having loads of robots walking around, cooking us dinner, cleaning our houses and changing our duvets. But the reality of it is, that’s not going to be the case.”
In the public eye
Dempsey believes housing providers need to strike a balance between these responsibilities and the need to innovate to drive change in the sector.
“Social housing is always in the public eye and unfortunately it gets a lot of grief. It’s been shot down a lot about how bad it is, when in fact, the way in which social housing manages quality and competence and risk is pretty good.
“But it can improve – it always can. And I think the issue is because it’s in the public eye, organisations need to be seen to be innovating to try to improve in an efficient way. They might think AI can provide that when, in actual fact, it probably can’t to the level that they need it to.
“So the sector needs to be aware of how good it is now at doing the stuff it does, but also be acutely aware of what technology is available and whether or not that can slowly fit into the process to improve the organisation over a 20 to 30-year period. It’s not about bringing this computer in, shoving in our database, and tomorrow – voila – we’re in a much better position.”
Dempsey acknowledges that he can sound ‘negative’ about the prospects of AI transforming the social housing sector. That is far from the case, he says, but he emphasises caution in how the technology should be handled.
“It is exciting, and it is going to change the world, but we just need to be very careful. It’s about being accountable for what you’re saying, not just what you’re delivering.”