Ildiko Almasi Simsic

Why Copilot will not solve all your E&S problems

I bet your company has a strategy to incorporate technology in the name of efficiency. If I had to guess, I would also say that it is adamant about using providers you already have access to, e.g. Microsoft. It was only a matter of time before the other teams in your company got on board, because Copilot was in fact helping them with more generic tasks. If you are in an E&S team, you have probably been offered the chance to trial Copilot, but you’re not sure what to do with it. Surely summarising notes, chatting to a single document or asking for suggestions to improve your report saves time and makes your life easier. But let’s see why Copilot might not be your one-stop shop when it comes to doing E&S work. Copilot is a generalist LLM that can do simple generic tasks. There are much better research LLMs out there! But this is not an infomercial, so let’s look at the facts:


Privacy

I’m starting with the main concerns that have been raised regarding Copilot. Privacy concerns emerged because of its cloud-based nature and the potential for data leakage. Copilot can access all the data a user can access, which may sometimes be more than necessary. The complexity of permissions means that enforcing least-privilege access is challenging, leading to potential exposure of sensitive data. Furthermore, Copilot’s ability to generate new content from existing data can lead to the mishandling of sensitive information. Fun fact: the US Congress has banned the use of Copilot for its staff, stating that ‘the Microsoft Copilot application has been deemed by the Office of Cybersecurity to be a risk to users due to the threat of leaking House data to non-House approved cloud services’.


Open-source LLMs such as Llama are considered more secure in terms of privacy because they can be deployed locally, so data never leaves the user’s system. Open-source LLMs also allow full transparency and auditability for security vulnerabilities. These models can be customised to limit data exposure to only what’s necessary, reducing the risk of privacy breaches.
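
To make the local-deployment point concrete, here is a minimal sketch of querying a Llama model entirely on your own machine, assuming you have installed the llama-cpp-python package and downloaded a GGUF model file; the file name and prompt below are illustrative, not part of any product described in this post:

```python
# Runs fully offline: the model weights and your prompt never leave this machine.
from llama_cpp import Llama

# Placeholder path: any GGUF build of an open model you have downloaded locally.
llm = Llama(model_path="./llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=4096)

response = llm.create_chat_completion(messages=[
    {"role": "user",
     "content": "Summarise the key mitigation measures in this ESIA excerpt: ..."},
])
print(response["choices"][0]["message"]["content"])
```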


Security

Microsoft claims to have implemented security measures for Copilot, such as access controls and proper user rights and privileges. However, cloud-based AI assistants like Copilot are inherently riskier than locally run LLMs because they rely on external servers and potentially vulnerable networks, increasing the likelihood of data breaches and unauthorised access.


What this means in the E&S world is that pre-disclosure documents and ESMSs that have not been disclosed could potentially be accessed by the model, ‘leaking’ information to other users. The issue is further complicated if the team works behind Chinese walls, where it is almost impossible to limit the AI’s training data and prevent certain users from accessing information.


Microsoft went even further this year with a new feature called Recall. It is probably the scariest one, bordering on Black Mirror territory. Recall logs every action a user performs through continuous screenshots that can be ‘recalled’ through user searches. Yes, it is sci-fi territory: anyone with access to those snapshots can see everything you did on your computer, including passwords and sensitive financial information. Microsoft stated that Recall ‘does not take snapshots of certain kinds of content, including InPrivate web browsing sessions in Microsoft Edge’. This is a not-so-subtle push to attract users to the less popular Edge browser. The UK data regulator, the Information Commissioner’s Office (ICO), has also made enquiries into the Recall feature, while privacy campaigners have compared it to spyware.


Training data

Copilot’s reliance on existing data-labelling practices from other Microsoft products can amplify inherited errors and outdated information. This is because terabytes of outdated and irrelevant data can be surfaced as relevant content, which may lead to incorrect or misleading responses.


Open-source LLMs avoid this issue by allowing users to fine-tune the model for specific tasks and industries, reducing the risk of surfacing irrelevant or outdated content. Additionally, local deployment means the model doesn’t rely on external data sources, minimising the risk of incorrect or outdated information.

Customisability

The last point about closed LLMs is their limited customisability, which can hinder their effectiveness for specific tasks or industries. Their training data comes from a wide range of sources, including the public domain, which may not be relevant or accurate for certain fields. In contrast, open-source models allow users to fine-tune the model for specific tasks or industries, as the sketch below illustrates.
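
For readers curious what such fine-tuning involves, here is a minimal sketch using the Hugging Face transformers, peft and datasets libraries to attach a LoRA adapter to an open model; the model name, example texts and hyperparameters are all placeholder assumptions, not a recipe from this post:

```python
# Minimal LoRA fine-tuning sketch; every name and value here is illustrative.
from datasets import Dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "meta-llama/Meta-Llama-3-8B"  # placeholder: any open causal LM works
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# Attach a small LoRA adapter so only a tiny fraction of weights are trained.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                         target_modules=["q_proj", "v_proj"]))

# Stand-in for your own domain corpus (ESIA chapters, ESMS procedures, etc.).
train_ds = Dataset.from_dict({"text": [
    "The ESIA identified noise impacts during the construction phase ...",
    "The grievance mechanism shall be accessible to all affected people ...",
]}).map(lambda ex: tok(ex["text"], truncation=True), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="es-llama-lora", num_train_epochs=1),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
)
trainer.train()
```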


The data used to train the LLM is probably far more important than which model you use, because model capabilities are improving very rapidly. That’s why, when we designed myESRA, we spent months thinking through the relevant, reliable and accurate data sources we could collect to make sure that your query is answered correctly. There is no need to upload anything to myESRA, and it has no access to the private documents in your cloud or on your desktop.




Today we got the news that Microsoft is launching the next iteration of Copilot, expanding its integration across key Microsoft 365 apps like Excel, PowerPoint, Teams, Outlook, Word and OneDrive. What this means is that Copilot Pages will allow AI-assisted teamwork with real-time, multi-user interaction, Copilot Agents will automate complex business processes, and so on. Microsoft is working hard to make it easier for anyone to add AI to their generic workflow. While we believe that AI is a fantastic tool, there is a difference between more generic AI systems and those developed to support experts in niche areas such as E&S. You can always visit www.essolutions.ai if you want to see what myESRA is about.
