Milestone Systems has decided to adopt and implement the G7 Code of Conduct for advanced Artificial Intelligence (AI) systems, becoming one of the first companies to do so.
“We need rules to ensure AI is being developed to serve humanity. But companies should not wait for regulation. They must take their own steps to identify and resolve the weaknesses and pitfalls of the AI they develop,” says Thomas Jensen, CEO of Milestone Systems, and continues: “When it comes to AI-enabled video, we have only just scratched the surface of its potential benefits and uses. However, we also understand some of the pitfalls, such as bias and false positives.”
“At Milestone Systems, we are taking significant steps to address potential weaknesses of our tools. By signing up to the G7 Code of Conduct we will continue to focus our efforts on building our software with trust, transparency, and accountability at the front of our minds.”
The International Code of Conduct for Organisations Developing Advanced AI Systems aims to promote safe, secure, and trustworthy AI worldwide. It was agreed by G7 leaders at the end of October 2023 alongside a set of Guiding Principles for the world’s most powerful democracies to follow when developing new AI systems.
Milestone Systems’ decision comes as the European Union has agreed on its own AI Act.
Applauding the AI Act
“While we applaud the AI Act, it will take a while before it is implemented, potentially a couple of years. In the meantime, we believe companies should strive to stay ahead of the regulation,” Thomas Jensen says and adds:
“We must not stifle innovation, but to prevent a public and regulatory backlash, AI businesses should be striving to build trustworthy AI. Adopting the G7 Code of Conduct is one such step all companies should take to help ensure responsible use of technology and foster public trust in the new possibilities it presents.”
The G7 AI Code of Conduct can be found here: https://digital-strategy.ec.europa.eu/en/library/hiroshima-process-international-code-conduct-advanced-ai-systems