Notes from PyGTA's Programmer Liability Round Table
We had a lively discussion last night about programmer liability. The consensus (such as it was) seemed to be that perfectly reliable software is extremely expensive and likely futile as a goal given the complexity involved; if every app had to meet milspec reliability, the industry as we know it would be toast, as costs would go through the roof. We discussed paths forward that don't require milspec reliability, such as codified best practices which, when followed, essentially dissolve liability, along with social mechanisms to revise those best practices whenever an error occurs (i.e. the current engineering model). We didn't resolve the question of whether market forces or legislation should be used to require greater reliability.
Notes from the discussion...
The EU is proposing the introduction of liability for computer programs, both commercial and Open-Source:
o required "support" period of 2 years
o "same basic rights as when they purchase a good"
Points
o Is liability a good idea?
o Restricts or enhances user choice?
o Can't find an insurance company willing to underwrite it?
o Prohibitive cost for complex systems
o Reduce security problems?
o Build trust with users?
o Increase reliability?
o Stifle innovation?
o Can't see or verify quality, cannot see errors
o "Without protection, consumer software will explode"
o "Free market rules, baby!"
o "We will find a way"
o We do not want to encourage lawyers to breed
o Would you be willing to contribute to Open Source if you were liable when someone *else* broke it?
o Would you require a particular working environment (the one true Linux, or Win2k SP4 with only X, Y, and Z drivers)?
o What level of warranty would you be willing to give (with what restrictions)?
o Provide downgrade options to previous software?
o Fixed in some period of time?
o Money Back?
o Damages?
o Damages and Loss of Business?
o Damages and Loss of Life?
o Would a particular methodology let you provide warranties?
o TDD? (see the test sketch after this list)
o Ada?
o Software review? (NetBSD, code-review, Mondrian)
o Robustness proofs?
o Would a "traceable" trail of liability make a difference?
o Code signing and the like (see the signing sketch after this list)
o "I want someone's arse on the line"
o I installed this little utility that might some day steal all your data when it's updated; am I liable for that utility's changed functionality?
o Forensics and blame?
o Can we tell who caused a critical failure?
o What is a software failure worthy of a "warranty event"?
o Power failure of the machine, software doesn't run?
o Cosmic ray flips a bit
o Automated "lint" reports an error?
o Code is malformed (indentation wrong in C)?
o Tests fail to cover an "impossible" condition (coverage gap?)
o Data/resource file (an icon, for instance) missing from a build?
o Slow operation? (botnet)
o Software doesn't port cleanly to exotic platform X?
o More memory used than expected?
o ERROR-level logged event?
o Error dialog raised?
o Ugly traceback appears on screen?
o User can't access the internet, so your AJAX application fails?
o Allows a botnet to be installed?
o Software core-dumps, but doesn't lose any data?
o Software core-dumps and loses data since last save?
o Software core-dumps and corrupts all data?
o Software security breached via software failure (all data leaked)?
o Someone injured?
o Someone killed?
o What would you warrant your software to do?
o How would you specify requirements?
o All marketed feature sets?
o Would you warrant it to e.g. "take up 0 or more bytes of disk space", with all other warranties explicitly denied?
o Does the software do what it is supposed to do?
o You can't use this software for science?
o Environmental restrictions (pre/post conditions; see the contract sketch after this list)
o Limitation of Liability?
o Personal harm?
o Shell corporations
o Do we only warrant certain types of software?
o Hospital, 9-1-1, aerospace, etc.
o Reasonable standard of care?
o Two-tiered software?
o Our software is warranted; it won't blow up your machine (small green sticker)
o Our software is unwarranted; it may blow up your machine (huge flashing red LED sticker)
o Only monopolies should be liable?
o Is software so irreducibly complex that it cannot be warranted?
o Is it just a matter of money?
o Do we only pay for what we care about? (Life-endangering, etc)
o Should an OS be considered "life-endangering" (what about when it runs a warship?)
o Money endangering?
o Engineering software
o Best Practices as a defense strategy
o Allow any organization to prove itself non-liable by claiming (and proving) standard best practices
o License computer users
o You can't sit down at *my* keyboard until I say so
o WGA licensing included
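A minimal sketch of the TDD point above: what "warranting" a specific behaviour through tests might look like in Python, using only the standard library's unittest module. The parse_price function and its contract are invented for illustration; each test encodes one behaviour a vendor might be willing to stand behind.

    import unittest

    def parse_price(text):
        """Parse a price string like '$1,234.56' into integer cents.

        Hypothetical example function for this sketch. Raises
        ValueError on malformed input rather than guessing.
        """
        cleaned = text.strip().lstrip("$").replace(",", "")
        dollars, _, cents = cleaned.partition(".")
        if not dollars.isdigit() or (cents and not cents.isdigit()):
            raise ValueError("malformed price: %r" % (text,))
        return int(dollars) * 100 + int(cents.ljust(2, "0")[:2])

    class ParsePriceWarranty(unittest.TestCase):
        # Each test is one behaviour we would be willing to warrant.
        def test_typical_price(self):
            self.assertEqual(parse_price("$1,234.56"), 123456)

        def test_whole_dollars(self):
            self.assertEqual(parse_price("12"), 1200)

        def test_malformed_input_is_rejected(self):
            # Bad input must fail loudly, not silently corrupt data.
            self.assertRaises(ValueError, parse_price, "twelve dollars")

    if __name__ == "__main__":
        unittest.main()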
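On the "traceable trail of liability" and code-signing points: a sketch of signing the digest of a release artifact so that a given build can be traced back to whoever holds the signing key. This assumes the third-party cryptography package (not the standard library), and the artifact bytes here are a stand-in for a real release file.

    import hashlib

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Vendor side: sign the digest of the release artifact.
    private_key = ed25519.Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    artifact = b"stand-in for the bytes of release-1.0.tar.gz"
    digest = hashlib.sha256(artifact).digest()
    signature = private_key.sign(digest)

    # User side: verify before installing.  A failure means this is
    # not the build the vendor put its name (and liability) behind.
    try:
        public_key.verify(signature, digest)
        print("signature OK: the vendor is on the hook for this build")
    except InvalidSignature:
        print("signature mismatch: tampered or unsanctioned build")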
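And on "environmental restrictions (pre/post conditions)": a sketch of writing the warranted operating envelope directly into the code, so that calls outside the envelope are rejected up front instead of failing mysteriously later. The requires decorator, the summarize function, and the 1-1000 record limit are all invented for illustration.

    import functools

    def requires(precondition, description):
        """Reject calls outside the warranted envelope, loudly and early."""
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                if not precondition(*args, **kwargs):
                    raise ValueError("precondition violated: " + description)
                return func(*args, **kwargs)
            return wrapper
        return decorator

    # Hypothetical warranted operation: only defined for 1-1000 records.
    @requires(lambda records: 1 <= len(records) <= 1000,
              "between 1 and 1000 records")
    def summarize(records):
        return sum(records) / len(records)

    print(summarize([3, 4, 5]))  # inside the envelope: prints 4.0
    print(summarize([]))         # outside the envelope: raises ValueError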
Comments
Florian on 05/21/2009 9:30 a.m.
Software represents virtual machines with more moving parts than even the most complex real machines.
Most complex real machines that come with liability and warranty attached are produced by mega-corporations on gargantuan research and development budgets (think Boeing, Lockheed Martin, NASA, Nissan, Toyota, Shell, Caterpillar, etc.).
However, not all software is equal. Some software (aeronautical, astronautical, military, nuclear-reactor, etc.) needs to adhere to stringent reliability standards. Other software, like personal home pages, computer games, ad-blockers, and screen savers, does not.
Most complex real machines are produced from parts whose reliability standards match the reliability standards of the end product. Nobody would attempt to mass-market a pencil sharpener built from milspec steel parts machined to micrometer precision.
Imposing indiscriminate reliability requirements on all software is entirely unrealistic and utopian, like imposing indiscriminate reliability requirements on cheap, mass-produced consumer goods.
Discriminating reliability requirements for software would require a gigantic bureaucratic machine to come up with the categories, assign a reliability requirement to each part that software is made of, and enforce compliance.
Barring the destruction of the software business as we know it on one hand, and an impossible administrative effort on the other, letting the market come up with the solution is the *only* choice we realistically have.
Federico Ceratto on 05/21/2009 2:52 p.m.
Warranties and quality requirements are everywhere. In many countries, warranties are required by law, as are minimum quality requirements, for a wide range of products from the most expensive to the cheapest: planes, houses, cars, home appliances, food, and clothes.
The free market is fine, but most customers cannot properly evaluate the quality of what they are buying. Sometimes they lack the required knowledge; most times it's just not practical. Would you dismantle a car into pieces and spend weeks inspecting them before buying it?
And with closed-source software, you are really buying a "black box".
Certainly every point you raise could be debated for years; on the other hand, common sense (where available) should prevail.
I think a company or a developer shouldn't worry if their software crashes because of cosmic-ray-induced errors. But how about selling software that contains backdoors?
Florian on 05/21/2009 5:48 p.m.
@Federico
You can't compare real and virtual goods; they differ in several key respects.
Real goods:
- are finite
- have a life cycle that ends (after which they are no longer produced, sold, marketed, etc.)
- saturate a market, so the market only floats a finite number of them
- are never free (due to real scarcity)
- have well-defined usages
- are less complex than software
Software:
- is potentially infinite
- never really dies, even less so if the source is available
- spawns and supports vastly more goods in its marketplace than any market bound by real scarcity
- is rarely well defined and often has unexpected uses
- is, most of the time, extremely complex, on a level comparable to space shuttles and nuclear reactors
I'm merely pointing out that it is entirely utopian to even attempt to regulate reliability in software space. Because if you did, you would either
1) end any and all software business as we know it, OR
2) create the mother of all bureaucracies, drowning in never-ending work to categorize every snippet of software ever made and enforce quality for every line of code ever written.
Neither of these options is realistic. That is why they are utopian, and also why they will not happen.
And no, this is not infinitely debatable. The debate is short and concise if you apply rudimentary logic.