News for CERN Community
Computer Security: Software Bugs: What if?

anschaef | Tue, 06/11/2019 - 10:53

Do you know what drugs and proprietary software have in common? You bear the consequences if the product you buy is of mediocre quality. There is no recourse. Your investment is lost. The big difference is that buying software is legal. Still, there is no chance that you can hand buggy software back, return it to the manufacturer and ask for it to be fixed (OK, you can ask, but…) or press for financial compensation. Instead, you as an individual, or we as an organisation, have to invest additional money in protecting our software stack, with its inherent bugs, against abuse… So, how can we create an efficient incentive to improve software quality? Legally enforced Bug Bounty Premiums.

While many big software manufacturers already employ so-called secure software development lifecycles to improve their products, many other products ship full of bugs in order to be first on the market. The user is the beta tester. Security comes… later. There is just no incentive to guarantee that at least the obvious blunders are corrected. Very frequently, in particular for devices on the Internet of Things, the software stack (operating system, network interface, web server, user interface) is just a hack, as the producing companies have no good knowledge of software design and security. Their business is the device itself: thermometers, cameras, you name it. They just make them “intelligent” by connecting them to the Internet. The same is true for smaller software development companies: they have a great idea to market, but neither the personnel nor the time to ensure a secure design and a software product with as few bugs as possible. Others just don’t care (enough). There is no incentive to invest in security, except for one: reputation. And looking at the past record of software blunders published in the media, rarely does a company go bust due to a security bug*. So, why care?

How to create an incentive for more secure software? Legally enforced Bug Bounty Premiums! A “Bug Bounty” programme today is a voluntary commitment by a company to pay you a certain amount of money if you report a software bug found in its products. Google runs one. Microsoft does. CERN does too (but, as we are taxpayer funded, we can hand out only T-shirts as a reward). Unfortunately, many other software developers don’t. And this is where legally enforced Bug Bounty Premiums would help. National governments, the European Union or, ideally, a global organisation should come up with a defined “price list” for bugs, and legally require any software manufacturer to pay that money to the first person who finds one. The infrastructure for recording bugs and keeping track of fixes has already been in place for a while: CVEs (“Common Vulnerabilities and Exposures”). A cross-site scripting bug gets you, say, $100; SQL injection, $200; command line injection, $1000; a root exploit, $10 000; etc. And, by law, software manufacturers would be forced to pay that sum to the first finder.
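Such a price list boils down to a simple lookup from bug class to premium. As a minimal sketch, assuming the illustrative amounts above (the class names and figures here are the article's examples, not an actual legal tariff):

```python
# Hypothetical premium "price list": bug class -> amount owed to the
# first finder. Amounts are the illustrative figures from the text.
BUG_PREMIUMS = {
    "cross-site scripting": 100,
    "sql injection": 200,
    "command line injection": 1_000,
    "root exploit": 10_000,
}

def premium_for(bug_class: str) -> int:
    """Return the fixed premium owed for the first report of a bug class."""
    return BUG_PREMIUMS[bug_class.lower()]
```

For example, `premium_for("SQL injection")` would return 200 under this sketch.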

So here come the incentives: either they pay the Bug Bounty Premium, or they invest in better software development processes in-house, or they engage with third parties to find weaknesses before Bug Bounty hunters do. But there are more advantages! Legally enforced Bug Bounty Premiums open a guaranteed revenue stream for software-savvy people. Security researchers. Computer engineers. IT students. Anyone who loves to poke into software and hunt for defects can make some decent additional money. And also those who in the past tended to sell their findings illegally on the dark market – they now have the option to move out of illegality and cash in legally.

Of course, there are some caveats to take into account, namely “software dissemination” and “open source”. For the former, instead of having fixed premiums ($100, $200, $1000, $10 000), the premium should scale with the dissemination of the buggy software. For that small library I wrote, used just by you and me, and in which you found a bug, you would make hardly any money. But if you find a vulnerability in a major operating system, a dominant web browser or a widely used library: bingo for you! And open source? This is where the state comes in. The premium is paid out of a national, European or international pot. Maybe this is the most problematic point, but in the long run it provides another incentive to software manufacturers: instead of maintaining (old) proprietary software and eventually paying out for bugs, they can consider making their source code public and open source – and the liability to pay in the event of bugs is gone. Benefit for the community: more open source code!
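The dissemination rule above can be sketched as a scaling factor on the base premium. The logarithmic factor below is an illustrative assumption of mine, not part of the proposal; the point is only that a library with two users pays almost nothing, while software with hundreds of millions of users pays far more:

```python
import math

# Base premiums per bug class (the article's illustrative figures).
BASE_PREMIUMS = {"xss": 100, "sqli": 200, "cmdi": 1_000, "root": 10_000}

def scaled_premium(bug_class: str, installed_base: int) -> float:
    """Scale the fixed premium by how widely the buggy software is used.

    The log10 scale factor is a hypothetical choice for illustration:
    10 users -> factor 1, 100 million users -> factor 8. A tiny installed
    base is floored at a token factor so the premium never quite hits zero.
    """
    factor = max(math.log10(max(installed_base, 1)), 0.01)
    return BASE_PREMIUMS[bug_class] * factor
```

Under this sketch, an SQL injection in a ten-user tool costs the manufacturer $200, while a root exploit in an operating system with 100 million installations costs $80 000.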

So, what if?

_____

Do you want to learn more about computer security incidents and issues at CERN? Follow our Monthly Report. For further information, questions or help, check our website or contact us at Computer.Security@cern.ch.

*The European General Data Protection Regulation changes this drastically when the bug exposes personal data. Companies in breach of the regulation are fined.

