Contxto – Another participant from Innóvate Perú—a government program—is steadily growing. This time around it is security startup Veronica Core, which revealed that its artificial intelligence (AI) tech is being used by Peruvian customs to oversee the country’s border with Bolivia.
Through its AI software and surveillance systems, Veronica’s solution can notify authorities if an inbound vehicle is blacklisted in Peru or Bolivia.
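That notification flow can be sketched roughly as a watchlist lookup. The plate numbers, watchlist contents, and function names below are illustrative assumptions, not Veronica's actual API or data:

```python
# Hypothetical sketch of the blacklist check described above.
# All plates, watchlists, and names here are invented for illustration.

# Per-country watchlists of flagged license plates (illustrative data).
BLACKLISTS = {
    "PE": {"ABC-123", "XYZ-789"},  # Peru
    "BO": {"1234-ABC"},            # Bolivia
}

def check_plate(plate: str) -> list:
    """Return the countries whose blacklist contains this plate."""
    return [country for country, plates in BLACKLISTS.items() if plate in plates]

def notify_authorities(plate: str):
    """Build an alert message if an inbound vehicle is blacklisted; else None."""
    hits = check_plate(plate)
    if hits:
        return f"ALERT: plate {plate} is blacklisted in {', '.join(hits)}"
    return None
```

In practice the plate string would come from the camera feed's recognition stage; the sketch only covers the lookup-and-alert step.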
Jarvis. I mean Edith. No, wait… Veronica
Veronica, which stands for Video Efficient Recognition of Named Identities and Content Analysis, was launched by Adolfo Pizarro, Isabel Melgar, and Raúl Diez Canseco in 2015.
Veronica’s technology uses the feed from security cameras to identify license plates, faces, and other objects. This is possible thanks to its use of artificial neural networks and internet protocol (IP) cameras. The system identifies patterns and then makes security decisions based on what it has learned.
The purpose of its software is to automate crime detection as well as prevent wrongdoing. Information can be channeled to law enforcement authorities for follow-up.
This security tech can be applied to any industry where surveillance is needed, such as banking. It can also be used in traffic enforcement, flagging cars with unpaid tickets or motorcyclists riding without helmets.
Innóvate Perú chose Veronica for its 2013-2014 startup batch, which resulted in an investment of PEN$450,000 (about US$132,000). Among the startup’s other accomplishments are winning the Wayra Perú contest in 2013 and being a finalist at Seedstars Lima 2016.
Veronica was also installed in the founders’ alma mater, Saint Ignatius of Loyola University, in Lima, to ensure campus security.
One of the startup’s primary goals is to make it financially viable for any company or government agency to enhance security through its software.
Big Brother’s watching you
The Caped Crusader in Christopher Nolan’s The Dark Knight illustrates how useful, but also how dangerous, surveillance technology can be. While it can help catch the Joker, it raises murky ethical questions.
Edward Snowden’s revelations in 2013 offered a real-life example of the ethical dilemmas this kind of technology poses. And if it falls into the wrong hands, it can be used to hurt people.
In that sense, I encourage anyone who uses AI tech to consider the ethical boundaries of its use.