Universal package specification

Eugene Gorodinsky e.gorodinsky at gmail.com
Sun Nov 29 07:18:48 PST 2009


2009/11/29 martin f krafft <madduck at madduck.net>:
> also sprach Eugene Gorodinsky <e.gorodinsky at gmail.com> [2009.11.29.1049 +0100]:
>> > Do you have any estimation how long it would take for the time
>> > invested into development, deployment, and bug fixing to be
>> > amortised?
>> >
>> No, I don't unfortunately. That would require at least calculating the
>> time invested by maintainers now, and estimation of that time doesn't
>> seem feasible.
>
> Well, I think you'll need to do a cost-benefit analysis.
>
I might be wrong, but I do not think a cost-benefit analysis is
relevant in this case.

When you're investing money in a product, you might do a cost-benefit
analysis to decide whether the product is worth the investment. When
you're investing money in your education, you don't, because you know
that at some point it will pay off. It's the same here, except that
instead of money you're investing time, and instead of an education
you get a more efficient system. At least that's how I see it. I might
be wrong and this system might not be more efficient; if so, I would
like to know where I'm wrong.

>> > What if a package depends on vim, or apache, or rsync?
>> >
>> For third-party plugins and for packages that allow for
>> third-party plugins there is a "base" field and a "plugins" field
>> in the specification. If that is not sufficient, the package can
>> be provided in a distribution-specific format as it probably is
>> provided now. If that does not work, we can still fall back to a tgz
>> plus instructions on how to install it. It's also possible to let package
>> vendors define their own interfaces, although I'm not sure if it's
>> a good idea.
>
> I don't see the gain for any distro.
>
In the best case, no work needs to be done to get a given piece of
software working on a given distro. In the worst case, no more work
needs to be done than the package maintainers are already doing.
Moreover, in the best case, bug reports end up in a single bug
database, and bugfixes are applied upstream.
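For illustration, the "base" and "plugins" fields mentioned above
could be expressed roughly like this (a purely hypothetical sketch;
the actual format of the specification is not given in this thread):

    name: somevimplugin
    version: 1.0
    base: vim >= 7.0

with the base package declaring that it accepts plugins:

    name: vim
    version: 7.2
    plugins: somevimplugin, anothervimplugin

The point is only that the relationship between a plugin and the
package it extends is expressed in the package metadata rather than
in distro-specific dependency rules.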

>> > Symbol versioning means that each symbol has a version and your
>> > application may well work with older versions of a library if the
>> > symbols it uses have not been changed (and ABI changes only
>> > involved e.g. new symbols).
>> >
>> > http://wiki.debian.org/Projects/ImprovedDpkgShlibdeps
>> >
>> I'm only aware of one such library - glibc. I don't know why this is
>> done though. Which other libraries do this?
>
> zlib, gcrypt, and others, in Debian at least.
>
Ok, thanks.
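For reference, symbol versioning on ELF systems is usually done with
a linker version script. A minimal sketch (the library and symbol
names here are made up for illustration) might look like:

    LIBFOO_1.0 {
        global:
            foo_init;
            foo_run;
        local:
            *;
    };

    LIBFOO_1.1 {
        global:
            foo_run_ex;
    } LIBFOO_1.0;

Such a script is passed to the linker when building the library, e.g.
with gcc's -Wl,--version-script=libfoo.map. An application that only
uses foo_run binds to the LIBFOO_1.0 version of that symbol, so it
can keep working with an older build of the library as long as those
particular symbols are unchanged, even if newer versions added
symbols like foo_run_ex.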

>> >> > All the data are maintained in a single location. They are then
>> >> > cached in multiple locations. That's not duplication, IMHO.
>> >> >
>> >> Maintained - yes. But I meant the actual distribution
>> >
>> > Who cares about cached data in the "actual distribution"?
>> >
>> Sorry, I meant the process of distributing files. Bandwidth is still
>> an issue, and that way you can download less.
>
> Not if they are compressed, which usually they are; compression
> means duplicated information takes a little to no extra space.
>
I'm not touting this as an advantage; it simply seems logical to do
this if you want to conserve space and bandwidth.
