[rdo-list] TripleO UI Packaging Strategy

Mark McLoughlin markmc at redhat.com
Fri Jul 22 06:17:22 UTC 2016


On Thu, Jul 21, 2016 at 4:23 PM, Honza Pokorny <honza at redhat.com> wrote:
> There still seems to be some confusion about what we're saying, so let
> me attempt to summarize:
>
> 1. bundling of npm dependencies (sources) undesirable but temporarily tolerated
>
> 2. bundling of build toolchain even more undesirable
>
> 3. all bundling of sources tolerated temporarily
>
> 4. start working on packaging build toolchain as soon as possible
>
> A modern JavaScript application (both frontend and Node.js) uses npm to
> manage its dependencies and to build production/release versions from
> sources.  All of this configuration information is contained in the
> package.json file.  The "dependencies" section of that file contains a
> list of direct application dependencies; the "devDependencies" section
> contains a list of build/minification dependencies.  Typically, one will
> run "npm install" to fetch all dependencies: these will be placed in the
> node_modules/ directory --- npm downloads sources along with artifacts
> (e.g. if the package is written in CoffeeScript, it will contain both
> the CoffeeScript sources and the compiled JS).  We also plan to use npm
> to build the minified code (e.g. "npm run build").
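>
> To illustrate (the package names and versions below are placeholders,
> not the actual tripleo-ui dependency list), the two sections live side
> by side in package.json:
>
>     {
>       "name": "tripleo-ui",
>       "version": "0.0.1",
>       "dependencies": {
>         "some-runtime-lib": "1.2.3"
>       },
>       "devDependencies": {
>         "some-build-tool": "4.5.6"
>       },
>       "scripts": {
>         "build": "some-build-tool --minify"
>       }
>     }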
>
> What I propose to do is:
>
> 1.  On release, run "npm install" to bring in all dependencies
> 2.  Create a tarball of node_modules
> 3.  Run "npm pack" to create a release package
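>
> In shell terms, steps 1-3 would be roughly (the tarball name in step 2
> is illustrative):
>
>     # 1. fetch all dependencies and devDependencies into node_modules/
>     npm install
>     # 2. snapshot the fetched dependencies
>     tar -czf node_modules.tar.gz node_modules
>     # 3. create the release tarball of our own sources
>     npm pack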

To be clear, this should be the tarball released by the TripleO
project. RDO packages upstream sources, and here you're creating new
source artifacts, which should also come from upstream.

I think it's possibly acceptable as a first step, but don't be
surprised if and when people declare this the ugliest thing they've
ever seen :)

Where would 'npm install' be run? In upstream infra CI or a developer's laptop?

> 4.  The tripleo-ui RPM spec will receive the package from step 3 as Source0

You would also need to verify that you can add Patch/%patch statements
easily to patch any of the code included.
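
For example (a sketch only; the Source1 name, the patch, and the %prep
details are my assumptions, not anything you've specified), the spec
would need to be able to carry something like:

    Source0:  tripleo-ui-%{version}.tgz
    # assumed name for the bundled dependency snapshot
    Source1:  node_modules.tar.gz
    # hypothetical bugfix patch, possibly touching a bundled dependency
    Patch0:   0001-fix-something.patch

    %prep
    # "npm pack" tarballs unpack into a package/ directory
    %setup -q -n package
    # unpack the bundled dependencies next to the sources
    tar -xzf %{SOURCE1}
    %patch0 -p1

and you'd want to confirm that %patch0 can reach into node_modules/ as
easily as into your own code.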

If, in order to fix a bug, you need to ask upstream in 18 months to
generate a new tarball with whatever 'npm install' happens to find on
the internet at that time, you no longer have the ability to do an
isolated bugfix.

Thanks,
Mark.


> 5.  The RPM build system has all of the application and build
>     dependencies in the node_modules.tar.gz file; it can build minified
>     dist files (release, production) without internet access
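>
> In spec terms, the %build section would then be roughly (a sketch; it
> assumes the node_modules tarball was already unpacked in %prep):
>
>     %build
>     # node_modules/ is already populated from the bundled tarball,
>     # so no network access is needed here
>     npm run build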
>
> Once we have this system in place and we can actually ship our code, we
> can start working on unbundling those dependencies (i.e. removing them
> from the node_modules tarball and publishing them as RPMs that the
> tripleo-ui.spec can require).
>
> How does this sound?  Do I understand this correctly?
>
> Honza Pokorny
>
> On 2016-07-19 17:34, Haïkel wrote:
>> 2016-07-19 16:53 GMT+02:00 Honza Pokorny <hpokorny at redhat.com>:
>> > On 2016-07-19 16:29, Haïkel wrote:
>> >> 2016-07-19 16:08 GMT+02:00 Florian Fuchs <flfuchs at redhat.com>:
>> >> >
>> >> >
>> >> > So a couple of questions, assuming/suggesting the following workflow:
>> >> >
>> >> >
>> >> > 1. As a first step, we make sure the build tools and dependencies exist as
>> >> > node modules on the build system, so it can compile the target JS files from
>> >> > them. We make sure all dependencies have compatible licenses.
>> >> >
>> >> > 2. On each new tripleo-ui release, the build system compiles new target JS
>> >> > files using the dependencies from step 1.
>> >> >
>> >> > 3. The build system adds the compiled files to the new package (which is
>> >> > otherwise based on the tripleo-ui distgit repo).
>> >> >
>> >> >
>> >> > Questions:
>> >> >
>> >> > - Is this workflow plausible/acceptable/feasible?
>> >> >
>> >>
>> >> Yes, though I'm not sure what you mean by the build system.
>> >> The build system has no internet access; apart from the base OS,
>> >> the only things available are the provided sources plus dependencies
>> >> declared as BuildRequires (which must themselves be packaged).
>> >
>> > I guess this is the biggest problem: our build system is responsible
>> > both for fetching dependencies and for actually building the project.
>> > Preferably, we'd like to run "npm install" from the RPM spec.  This step
>> > requires internet access, and therefore violates one of the rules. Since
>> > our dependencies are fixed/pinned, there is a fair degree of certainty
>> > that the dependencies will be the same each time the build is run[1].
>> > Npm install brings in sources, not artifacts.
>> >
>>
>> There's ongoing work to integrate language-"native" packages with the
>> RPM ecosystem (the so-called modularity working group in Fedora).
>> In the future, the build system will have access to an internal npm
>> registry mirror (and a PyPI mirror for Python, etc.), but that work
>> is still in progress and there are many issues to solve (notably
>> reproducibility).
>>
>> For now, we either have to package dependencies or bundle them. Our
>> build and delivery infrastructure is provided by CentOS; these are not
>> constraints that we can bypass, nor infrastructure that we can
>> maintain on our own.
>>
>> The landscape of software engineering was very different when
>> GNU/Linux packaging and distributions were created, and it'll take
>> time to adapt to modern software engineering practices (for better
>> or worse).
>>
>> > What would be your preferred solution?  Should we try and use xstatic?
>> > We could also bundle our dependencies along with the source tarball ---
>> > would that ease your mind?
>> >
>>
>> xstatic would be nice, but I'd prefer that you check with our Horizon
>> developers first. They have more experience on that topic, especially
>> Mattias, who also maintained the Horizon packaging.
>> Yes, direct dependencies can be bundled, and we can tolerate bundling
>> the minification toolchain (provided it complies with our licensing
>> terms), but that would be a temporary exception.
>>
>> > [1]: Yes, I know about the leftpad fiasco :)
>> >
>> >>
>> >> > - If it is, would that flow be acceptable for now only, or even
>> >> >  permanently, given that all sources are free software and the build is
>> >> >  transparent and reproducible?
>> >> >
>> >>
>> >> That's the stable state we want to reach.
>> >> Depending on the amount of work needed, we may tolerate temporary
>> >> exceptions, but they have to be approved by RDO maintainers in our
>> >> weekly meeting.
>> >>
>> >> > - Would version changes for dependencies have to be reviewed separately or
>> >> >  could they be updated with each new build based on the version information
>> >> >  in the upstream repo's package.json file?
>> >> >
>> >>
>> >> The process is:
>> >> * an initial review when you introduce a new package (unless it's
>> >> already packaged in Fedora, as Fedora packages have an exemption
>> >> from the legal team)
>> >> * version bumps are done directly without peer review (though
>> >> release wranglers and CI check sanity)
>> >>
>> >> > - Could steps 1. and 2. be combined, so tools and dependencies are updated
>> >> >  and installed on each new release? (Assuming dependency changes are
>> >> >  reviewed beforehand.)
>> >> >
>> >> >
>> >> > Thanks for clarifying!
>> >> > Florian
>> >> >
>> >>
>> >> Up to a certain extent: while we tolerate bundling web assets, I'd
>> >> prefer that we not bundle the toolchain, and that we try to keep it
>> >> stable. Not that we'd enforce strict constraints on that, but
>> >> remember that we have limited resources to maintain the whole
>> >> distribution.
>> >>
>> >> Regards,
>> >> H.
>> >>



