After the discussions on technology at the #IAIA24 conference, I felt a need to speak up about AI from a social development specialist's perspective. The alternative title of this blog post was 'How social specialists will not be replaced by robots', but I changed it to convey a different message: building on our strengths as humans when doing social development work. The introduction of automation, technology and artificial intelligence that doesn't only collect information, but in certain cases analyses it, provides computation and interpretation, and ultimately makes decisions, is a scary prospect for anyone in the social impact assessment field. The fear of bots replacing us and drones conducting virtual socio-economic surveys is far-fetched - but very real for many.
But what does technology really mean for social specialists? After all, we are the ones working on the front lines, face to face with people. This article might come as a surprise for those who only know me wearing my ‘technology startup CEO’ hat, but not for those who’ve known me in my previous life as a social development specialist. I must admit, I am pretty ‘low-tech’ and screen-free when it comes to doing fieldwork – but not 'low-tech' at all when it comes to automating time-consuming tasks. Working with large language models, automation and simple programming for data retrieval taught me a big lesson: the bots will always find an answer for you! They will never say ‘I don’t know’ or ‘I’m not sure’. They will collect whatever answer they think is most relevant to the question and present it as fact. For chatbots that are connected to the internet, this means hallucinations, false answers and fake research studies – the prime criticism of this type of AI discussed by E&S experts. The situation is different for chatbots that are connected only to a closed knowledge base. These bots could be – and should be – programmed to say they don’t have the data – a bot’s version of ‘I don’t know’.
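To make that idea concrete, here is a minimal, purely illustrative sketch of a 'closed knowledge base' bot: it answers only from the handful of entries it actually holds and otherwise abstains. The knowledge-base entries, the simple keyword matching and the wording of the refusal are all invented for illustration – this is not a description of any real product.

```python
# A toy 'closed knowledge base' bot (illustrative only): it answers solely from
# the entries below and says it doesn't know when nothing relevant is found,
# instead of inventing an answer.

# Hypothetical, hand-written knowledge base of project documents.
KNOWLEDGE_BASE = {
    "resettlement assistance": "Compensation is paid at full replacement cost before displacement.",
    "grievance mechanism": "Complaints are acknowledged within 7 days and resolved within 30 days.",
}


def answer(question: str) -> str:
    """Return the best-matching entry, or an explicit refusal."""
    question_words = set(question.lower().replace("?", "").split())
    best_topic, best_overlap = None, 0
    for topic in KNOWLEDGE_BASE:
        overlap = len(question_words & set(topic.split()))
        if overlap > best_overlap:
            best_topic, best_overlap = topic, overlap
    if best_topic is None:
        # The bot's version of 'I don't know'.
        return "I don't have that information in my knowledge base."
    return KNOWLEDGE_BASE[best_topic]


print(answer("What resettlement assistance will affected households receive?"))
print(answer("What is the project's carbon footprint?"))  # prints the refusal
```

The same principle applies when the bot sits on top of a large language model: answers are drawn only from the closed document set, and a refusal is returned when nothing relevant is retrieved.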
The fact that AI will always find an answer leads to issues around the validity of information. As social specialists, we face complex issues that require careful case-by-case assessment built on real, true and verifiable data. The decisions we make about impact magnitude or resettlement assistance can have serious consequences for the lives of people affected by development projects. My approach has been to find the unique advantage that we humans have and that no AI can mimic: we are curious and we can admit our limitations and insecurities – at least some of us! We understand that, as social researchers in the field, we bring our own perspectives and world views to the analysis of each situation and issue. We have the capacity to see the limitations of a research methodology, or of how it is implemented. We can acknowledge our biases and prejudices, and we can work as part of a community of practitioners.
The power of saying ‘I don’t know’ or ‘I’m not sure’ and asking other specialists for help is incredible, and something that cannot be replicated by a machine. Identifying the need to work closely with local communities and sitting down for a coffee to listen to their concerns is not something a bot can easily do. Bots might be able to listen to conversations and take notes, but the human connection is what makes social impact assessments so personal. Bringing different perspectives together for a more balanced and objective assessment might be a good place for AI to support a specialist, as AI can help counter unconscious human bias. The best results, in my view, come from human experts talking to other humans and using technology to automate repetitive manual tasks.
I have advocated for an approach where technology is leveraged as a teammate for human experts – almost an extension of the human – empowering them to make more informed decisions. I do believe we can make technology work for us in the social impact field by building on our strengths as humans: evaluating and interpreting data and tailoring mitigation measures to specific situations. No two projects are the same, even if they are in the same industry or the same country. Every situation we come across is different. Humans can pick up on non-verbal cues or half-sentences that might lead to big revelations, changing our whole approach to compensation or entitlements. We can observe communities that don’t have a digital footprint, such as Indigenous communities, and give them a voice.
The technology is there to provide solutions – we need to choose wisely which problems we want to solve!