President Donald Trump is looking to overhaul the government’s computer systems and use of technology, after meeting with major tech CEOs to discuss changes to a system that, in parts, still relies on floppy disks.
Speaking at the meeting, Trump said: “Our goal is to lead a sweeping transformation of the federal government’s technology that will deliver dramatically better services for citizens, stronger protection from cyberattacks… and up to a trillion dollars in savings for taxpayers over the next 10 years.
“We’re embracing big change, bold thinking and outsider perspectives to transform government and make it the way it should be. And at far less cost.”
The urgency to make changes, especially in the use of cloud technology, also comes in response to the surprising fact that some US federal agencies run systems as old as 56 years – and still use floppy disks. This was revealed by Trump’s senior advisor, and son-in-law, Jared Kushner, who, speaking in public for the first time in the role, set out just how archaic US federal systems are.
“It turns out that federal agencies collectively operate 6,100 data centres – the vast majority of which can be consolidated and migrated to the cloud,” said Kushner. “Many of our federal systems are decades old, with our 10 oldest being between 39 and 56 years old. The Department of Defense, for example, uses eight-inch floppy disks on some of its legacy systems.”
At the meeting, Trump was urged by Apple CEO Tim Cook to put citizens at the centre of decisions about which services to adopt, arguing they are too often neglected. Amazon CEO Jeff Bezos, meanwhile, told Trump that the US needs to focus on AI and machine learning within government to improve services for citizens.
The government’s appetite for technology could be called into question, however, following a data leak affecting more than 198 million US voters. The leak, discovered by UpGuard, came about after the data was left exposed by Republican data firm Deep Root Analytics, which stored it on a publicly accessible cloud server. The data included names, dates of birth, home addresses, phone numbers and voter registration details, as well as information on ethnicity and religion.
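To illustrate the kind of misconfiguration involved: cloud storage is typically governed by access control lists (ACLs), and a bucket becomes world-readable when a grant extends read access to everyone. The sketch below is hypothetical – the grant structure and group URIs follow the shape of Amazon S3 ACLs, and it is not Deep Root’s actual configuration – but it shows how such an exposure can be detected programmatically.

```python
# Group URIs that, in S3-style ACLs, denote "anyone" or "any signed-in user".
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def publicly_readable(grants):
    """Return True if any ACL grant gives read access to the world."""
    for grant in grants:
        grantee = grant.get("Grantee", {})
        if (grantee.get("Type") == "Group"
                and grantee.get("URI") in PUBLIC_GRANTEES
                and grant.get("Permission") in ("READ", "FULL_CONTROL")):
            return True
    return False

# A misconfigured grant list of the kind that exposes a bucket publicly:
exposed = [{"Grantee": {"Type": "Group",
                        "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
            "Permission": "READ"}]

# A private bucket: only its owner holds access.
private = [{"Grantee": {"Type": "CanonicalUser", "ID": "bucket-owner"},
            "Permission": "FULL_CONTROL"}]

print(publicly_readable(exposed))   # True
print(publicly_readable(private))   # False
```

In practice such checks run against ACLs fetched from the provider’s API; the point is that a single over-broad grant is enough to expose an entire data set.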
“This is highly illustrative of today’s data governance problem – huge amounts of data (much of it sensitive) from different original sources, when aggregated together can increase both the value of the data set and the consequential privacy impact from such a breach. Ultimately the more data you have on an individual, the more you know about them – and if that data set gets into the wrong hands it creates a whole host of potential issues,” said Peter Galdies, technical director at data governance, risk and compliance consultancy DQM GRC.
“Enhancing this problem is the extensive ‘food chain’ of organisations managing personal data including third party data providers, analytical processors and those client organisations using the data for their own business purposes. This means the management and protection of individual rights to privacy is hard to ensure.
“If this data had belonged to European or UK residents then this would have qualified as a hugely serious breach of the new GDPR law (particularly as voting preferences are defined as ‘special categories’ of sensitive data), potentially resulting in very serious consequences for all the organisations found responsible, including the data processors,” Galdies continued.
“It’s imperative that organisations which are commissioning agencies or other third parties to manage data on their behalf conduct serious due diligence before engaging them to ensure their data handling processes are sufficiently robust to protect the personal information of individuals. Relying on contracts will simply not be enough anymore.”