I am an embedded systems developer.
I grasp the reasons.
I am extremely angry over this because I lost a lot of irreplaceable
work.
I am partly angry at myself - while I did back up a lot of the
trees I thought might get stomped on, I clearly did not back up
everything I should have. Of course, this is an era of 300 GB+ laptop
hard drives, so insisting on complete backups is becoming increasingly
impractical.
Regardless, this is not a simple issue, and I think there is a
strong argument for a very light touch.
One of Linux's appeals to me over more than a decade has been the ease
with which damaged systems can be repaired - simply reinstalling on top
of an existing system has nearly always proved equivalent to a repair.
While Linux is much easier to repair by hand than Windows, doing so can
still be incredibly time-consuming, and often just reinstalling is quicker.
Another appeal is that almost anything you think should work - anything
that makes sense - probably does work under Linux. I was very active in
the Windows NT beta process, and I got very tired of arguing that an
approach which made sense to me and had worked in the past should
continue to work, only to be told by people with knowledge of the
internals that there was a better way.
And honestly, I am not sure I accept that you have to delete
anything. I grasp that you must replace numerous files - that to ensure a
working and robust system you must replace whatever libraries are found
with those that match this particular install. It is even useful to be
able to go backwards, for example reinstalling Intrepid over Karmic.
But off the top of my head I can think of extremely few instances
where, if standards are being followed, the mere presence of a file from
a previous install should destabilize a system. The few that exist seem
to be unique to things like udev, which act on any file present in a
given directory.
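To illustrate the pattern I mean (a minimal Python sketch, not udev's
actual code - the persistent-net example is an illustrative assumption):
udev applies every *.rules file it finds in its rules directory, so a
stale file from a previous install is treated exactly like a current one:

    # Sketch of the "act on any file present" pattern: every *.rules
    # file in the directory is applied, so a file left behind by a
    # previous install is indistinguishable from the current install's.
    from pathlib import Path

    RULES_DIR = Path("/etc/udev/rules.d")

    def rules_udev_would_apply(rules_dir: Path = RULES_DIR) -> list[Path]:
        # udev sorts rules files lexically by name before applying them.
        return sorted(p for p in rules_dir.glob("*.rules") if p.is_file())

    for rule in rules_udev_would_apply():
        # A stale 70-persistent-net.rules from the old install shows up
        # here just like the fresh install's own files.
        print(rule)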
Anyway, with respect to principles and policy, I think the best
approach is for the installer to complain about, and make note of, the
presence of leftover files in system directories.
I do not care if you tell the user this is unsupported.
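Something as simple as the following would satisfy me. This is a rough
Python sketch of the idea only - the /target mount point, directory
list, and log path are assumptions for illustration, not partman-target's
actual code:

    # Hypothetical "complain and make note" pass: flag files in system
    # directories that predate this install, but never delete them.
    import time
    from pathlib import Path

    TARGET = Path("/target")                    # assumed target mount point
    SYSTEM_DIRS = ("etc", "lib", "usr", "var")  # assumed system directories
    INSTALL_START = time.time()                 # recorded before copying files
    LOG = Path("/var/log/installer-leftovers.log")  # assumed log location

    def note_leftovers() -> None:
        with LOG.open("a") as log:
            for name in SYSTEM_DIRS:
                root = TARGET / name
                if not root.is_dir():
                    continue
                for path in root.rglob("*"):
                    try:
                        if path.is_file() and path.stat().st_mtime < INSTALL_START:
                            # Warn the user; leave the file alone.
                            log.write(f"leftover file (unsupported): {path}\n")
                    except OSError:
                        pass  # skip unreadable entries rather than touch them

    note_leftovers()

The point is simply that the decision is surfaced to the user rather
than made for them.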
Finally, I would note that the only reason I had to "reinstall" was
that a system crash in the middle of an apt upgrade left a system I had
been happy with unbootable.
I also grasp that sometimes things like this happen in beta and
alpha stages - I support clients and develop software too.
But there are two issues:
The first is the bug itself - that has been reported and will
hopefully be dealt with.
The second is that there appears to be an actual intent to get
much more aggressive about file deletion.
I wish to make it clear that I am strongly opposed to that
choice. I understand there are benefits, but they do not outweigh the
liabilities.
Colin Watson wrote:
> The purpose of cleaning up directories that might be owned by the system
> is that if we do not do this then we have no chance whatsoever of being
> able to support the resulting conjoined-twin system. We need to clear
> out anything put there by the previous *operating system*, while
> preserving *user data*. It's a delicate balancing act and tends to
> involve a lot of special cases; when we miss a case we get bugs like
> these.
>
> ** Package changed: ubiquity (Ubuntu) => partman-target (Ubuntu)
>
> ** Also affects: partman-target (Ubuntu Karmic)
> Importance: High
> Status: Confirmed
>
> ** Changed in: partman-target (Ubuntu Karmic)
> Milestone: None => ubuntu-9.10
>
> ** Changed in: partman-target (Ubuntu Karmic)
> Assignee: (unassigned) => Evan Dandrea (evand)
>
>
--
Dave Lynch DLA Systems
Software Development: Embedded Linux
717.627.3770 <email address hidden> http://www.dlasys.net
Cell: 1.717.587.7774
Over 25 years' experience in platforms, languages, and technologies too numerous to list.
"Any intelligent fool can make things bigger and more complex... It takes a touch of genius - and a lot of courage to move in the opposite direction."
Albert Einstein