[Asterisk-Users] Re: Advice on OS Choice

Joe Greco jgreco at ns.sol.net
Fri Oct 15 19:49:40 MST 2004


> On Friday 15 October 2004 17:22, Joe Greco wrote:
> > > If they're sharp cookies then they're also smart enough to know that they
> > > are liable if they fuck something like that up.  Having access to the
> > > source is a red herring in this case.  They could just as easily tinker
> > > with hex dumps and try to make it work.
> >
> > Giving them the source is like a road map to the system.
> 
> Nothing wrong with that, IMO.  You are placing the maintenance of the machine 
> in their hands.  Even without source there are some real trust issues at 
> stake.

Correct.  That's the point.  Handing out the source increases the severity
of the trust issues.

> > With hex dumps, it's a lot more difficult for them to bypass your redundant
> > system integrity checks.  Not impossible, but a lot more difficult.  It
> > means you need to put some serious effort into the reverse engineering.
> 
> You use hardware checks and as I put in another reply, the hardware keys to do 
> the upgrade are in the hands of someone who *is* responsible.  Hell have the 
> firmware upgrade code on a cryptokey that someone with authority needs to 
> insert.  Can't hack around that.
> 
> Oh but wait, they could already write their own firmware loader and use that.
> 
> Reductio ad absurdum.  I'm not playing that game with you.  

There's always a way.  The question is, do you make it easy by giving them
detailed technical information about the implementation of the system (that
would be "source code"), or do you make them /work/ at it?

> > If you give them source, it's more a matter of "grab the appropriate
> > compiler off the 'net and have at it".
> 
> Again where's the upgrade/install policy that wasn't being followed?  This 
> isn't a software or even a licensing issue.

You're sure right about that.  When someone dies as a result, the question 
of where the policy was becomes irrelevant in the bigger scheme of things.

> > Now what happens next is even worse, because the electronics shop guy who
> > did this, in a very human gesture of "CYA", replaces the modified image
> > with the factory image, because the first thing they did was to send the
> > unit down to the shop as defective.
> 
> > Are they liable?  Well, of course they are, if you can prove it.  This
> > requires that someone be clued in to the possibility that this happened.
> 
> Your scenario can be played out any number of ways, with or without source.  
> You routinely send your life critical hardware down to Bob and Doug's repair 
> shop?  You have bigger issues in place.

What are we talking about "Bob and Doug's repair shop" for?

I'm talking about a manufacturer who sells medical monitoring equipment to
a major hospital campus.  The campus WILL and DOES have its own electronics
shop, with technicians who know the business end of a soldering iron, and
frequently people who are damn sharp with computers as well, because they
are almost always burdened with the complexities of fixing and maintaining
a weird custom mishmash of equipment and networks required by their
employer.

What the customer does with the hardware is the issue.  Unless you, as a
manufacturer, can stand guard over that equipment, 24/7/365, it *will*
leave your sight.  Once that happens, you're never entirely certain about
what happens to it, and no amount of wishing will change that.

> > Under a closed source model, this kind of thing is generally considered
> > highly unlikely to virtually impossible, because the equipment in question
> > runs a variety of integrity checks to make sure that the program image has
> > not been altered (most frequently due to the storage medium going bad).
> 
> True enough but the system integrity checks would also include autoshutoff due 
> to alarm condition, and likely in hardware as well if at all possible. 

Too complex.  You can do that on an embedded hardware platform - sometimes.
You can't really do that on a platform running a large modern operating
system like UNIX.  So you make it as highly robust as feasible, and you
minimize your risk profile.

Your custom init executable (which is the first point of entry) does basic 
things like making sure it is process ID 1, and that no other processes 
are running.  Then you start verifying executable images, and once you're 
happy, you start mounting filesystems and launching other parts of the
system, all of which interlock with your custom init.
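
A minimal sketch of that flavor of init, purely to illustrate the idea - this
is not our actual code, and the /sbin/monitord path, the OpenSSL SHA-1 digest
check, and the placeholder digest value are all invented for the example:

/* Hypothetical integrity-checking init, for illustration only. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <openssl/sha.h>

/* Known-good digest for /sbin/monitord; real value elided. */
static const unsigned char expected[SHA_DIGEST_LENGTH] = { 0 /* ... */ };

static int verify_image(const char *path)
{
    unsigned char buf[4096], digest[SHA_DIGEST_LENGTH];
    SHA_CTX ctx;
    size_t n;
    FILE *fp = fopen(path, "rb");

    if (fp == NULL)
        return -1;
    SHA1_Init(&ctx);
    while ((n = fread(buf, 1, sizeof(buf), fp)) > 0)
        SHA1_Update(&ctx, buf, n);
    fclose(fp);
    SHA1_Final(digest, &ctx);
    return memcmp(digest, expected, sizeof(digest)) == 0 ? 0 : -1;
}

int main(void)
{
    /* First prong: refuse to do anything unless we really are init. */
    if (getpid() != 1) {
        fprintf(stderr, "init: not PID 1, refusing to run\n");
        return 1;
    }

    /* Second prong: check the monitoring binary against the known digest
     * before it ever gets executed.  Fail closed if it doesn't match. */
    if (verify_image("/sbin/monitord") != 0) {
        fprintf(stderr, "init: image verification failed, halting\n");
        for (;;)
            pause();
    }

    /* Happy: mount filesystems, launch the verified pieces, reap children.
     * (Mounts and the "no other processes running" check are elided.) */
    if (fork() == 0) {
        execl("/sbin/monitord", "monitord", (char *)NULL);
        _exit(1);
    }
    for (;;)
        wait(NULL);
}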

Problem:  shop tech has a copy of your init source code, and works out how
to compile his own, without the checks, and replaces the executable.  One
prong of your integrity check is now compromised.

And the GPL was the cause of that, because the init was derived from a 
GPL'd init.

(Well, in fact, it wasn't; it was a very cool, very fast C program written
from scratch...  I've actually written several init implementations over
the years, lots of fun, heh)

So the question, again, is *why* did that code need to be distributed?
Certainly the community gains nothing of value from it.

> There's a million ways to play this, and I'm sorry, but having access to the 
> source (or rather the build environment, which is *not* covered by GPL 
> licensing) is *not* part of the problem.  Institutional policy exists for a 
> reason, and having the source handy doesn't mean you can disregard policy, no 
> matter how trivial the change.

Again, that's not something that we - as a manufacturer - would have had
control over, even if it were relevant, which it isn't, really.

To put it another way:  Make me a Linux or FreeBSD server.  Publish the 
source you used for the server.  Now secure it such that there's no way
for me to modify the system, while still leaving it possible to repair
the system (i.e. no encasing the system in concrete).

Remember that my most likely course of attack will be to simply remove
the drive and install it on another host system.

> > Of course, but that doesn't always translate as you'd hope.
> 
> > You can have all the nifty policies you want, but there's always the person
> > who thinks they're smarter than the policy.  Frequently they might even be
> > right, because many policies are moronically stupid.  Unfortunately, these
> > people tend to learn to ignore policies and do as they please anyways.
> 
> Then you change the policy.  People die because policy isn't followed all the 
> time.  Hell they dropped a quarter billion dollar satellite because policy 
> wasn't followed.  You fix the problem, not point to ways to make it harder to 
> abuse the policy.

Again, policy is not the manufacturer's to dictate.  I agree with policies 
such as "thou shalt only run blessed images".  But how does that policy stop
some hospital shop monkey with more smarts than common sense from doing
something "clever"?

You stop that kind of stuff by employing technology to defend against it
happening.  You can't really do that, though, if the system has been GPL'd
and the shop monkey knows your angle by reading the source.

The standard security rules apply.

Doors are nice, but can always be broken.

A child can rip a screen door.  An average person can kick through a
hollow core door.  Me and my steel toed boots can kick down a solid core
door.  I can open a steel door with a crowbar.  I can blow a bank vault
door with sufficient explosives.  And given a large military force, I can
trivially take Fort Knox.  There's no keeping a determined intruder out,
if he's got appropriate resources.

You don't want to give him resources like source code.

> > (Incidentally, this is part of why the LGPL was introduced, so even
> > the so-called FSF has tipped its hat to these legitimate concerns.)
> 
> And again, having access to the source is what's at stake here, not the build 
> environment, which would have made any of these scenarios work.

Unless you go to some great effort to make the source uncompilable without
the build environment, that's pretty much saying nothing.  I'll also note
that I've frequently compiled software designed to be built with gmake on
systems without gmake, alternating build attempts with reading the
Makefile - a simple counterexample showing that you don't necessarily
need the intended build environment to compile sources.

... JG
-- 
Joe Greco - sol.net Network Services - Milwaukee, WI - http://www.sol.net
"We call it the 'one bite at the apple' rule. Give me one chance [and] then I
won't contact you again." - Direct Marketing Ass'n position on e-mail spam(CNN)
With 24 million small businesses in the US alone, that's way too many apples.


