Notes from our meeting today:
* We have outlined three timelines for handling dependencies:
  a) short term - bundle everything
  b) long term - unbundle toolchain
  c) very long term - use new npm registry
* Currently, the build system uses nodejs 0.10.x; let's look at
  upgrading that
* Upstream CI should run selenium tests and other smoke tests
* Related patches:
  * openstack infra -
* once these are merged:
  * open a request for review in rdo
  * create a distgit repo for tripleo-ui
Feel free to add to this
Thanks
Honza Pokorny
On 2016-07-21 17:47, Haïkel wrote:
2016-07-21 16:23 GMT+02:00 Honza Pokorny <honza(a)redhat.com>:
> There still seems to be some confusion about what we're saying, so let
> me attempt to summarize:
>
> 1. bundling of npm dependencies (sources) undesirable but temporarily tolerated
>
> 2. bundling of build toolchain even more undesirable
>
> 3. all bundling of sources tolerated temporarily
>
> 4. start working on packaging build toolchain as soon as possible
>
> A modern javascript application (both frontend and nodejs) uses npm to
> manage its dependencies and to build production/release versions from
> sources. All of this configuration information is contained in the
> package.json file. The "dependencies" section of that file contains a
> list of direct application dependencies; the "devDependencies" section
> contains a list of build/minification dependencies. Typically, one will
> run "npm install" to fetch all dependencies: these will be placed in the
> node_modules/ directory --- npm downloads sources along with artifacts
> (e.g. if the package is written in coffee-script, it will contain both
> the coffee-script sources and the compiled js). And, we plan to use npm
> to also build the minified code (e.g. "npm run build").
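>
> For a concrete (if hypothetical) picture, a trimmed-down package.json along
> these lines would cover all of the above; the package names, versions, and
> the build command are only placeholders:
>
>     {
>       "name": "tripleo-ui",
>       "version": "0.0.1",
>       "dependencies": {
>         "react": "15.2.1"
>       },
>       "devDependencies": {
>         "webpack": "1.13.1"
>       },
>       "scripts": {
>         "build": "webpack --config webpack.production.config.js"
>       }
>     }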
>
> What I propose to do is:
>
> 1. On release, run "npm install" to bring in all dependencies
> 2. Create a tarball of node_modules
> 3. Run "npm pack" to create a release package
> 4. The tripleo-ui RPM spec will receive the package from 3. as Source0
> 5. The RPM build system has all of the application and build
> dependencies in the node_modules.tar.gz file; it can build minified
> dist files (release, production) without internet access
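>
> For illustration, steps 1-3 might look roughly like this when run from a
> checkout of the release tag (a sketch only; exact paths and tarball names
> are illustrative):
>
>     npm install                               # 1. fetch dependencies + devDependencies into node_modules/
>     tar czf node_modules.tar.gz node_modules  # 2. snapshot the bundled dependencies
>     npm pack                                  # 3. produce tripleo-ui-<version>.tgz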
>
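> On the RPM side, a rough sketch of how the spec could consume those two
> artifacts without network access (the macros, directory names, and Source
> numbering here are hypothetical, not the actual tripleo-ui.spec):
>
>     Source0: tripleo-ui-%{version}.tgz
>     Source1: node_modules.tar.gz
>
>     %prep
>     %setup -q -n package          # "npm pack" tarballs unpack into package/
>     tar xzf %{SOURCE1}            # restore the pre-fetched node_modules/
>
>     %build
>     npm run build                 # minification uses only the local node_modules/
>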
> Once we have this system in place and we can actually ship our code, we
> can start working on unbundling those dependencies (i.e. removing them
> from the node_modules tarball, and publishing them as RPMs that the
> tripleo-ui.spec can require).
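>
> Once a dependency has been unbundled and packaged, the spec would then pull
> it in the usual RPM way; for example, assuming Fedora-style npm() virtual
> provides and purely illustrative package names:
>
>     BuildRequires: npm(react)
>     BuildRequires: npm(webpack)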
>
> How does this sound? Do I understand this correctly?
>
> Honza Pokorny
>
Looks ok as a first step.
The difficulty is that as long as we have this bundling, we won't be able
to build TripleO UI in DLRN (except by introducing a specific
workaround).
H.
> On 2016-07-19 17:34, Haïkel wrote:
>> 2016-07-19 16:53 GMT+02:00 Honza Pokorny <hpokorny(a)redhat.com>:
>> > On 2016-07-19 16:29, Haïkel wrote:
>> >> 2016-07-19 16:08 GMT+02:00 Florian Fuchs <flfuchs(a)redhat.com>:
>> >> >
>> >> >
>> >> > So a couple of questions, assuming/suggesting the following workflow:
>> >> >
>> >> >
>> >> > 1. In a first step, we make sure the build tools and dependencies exist
>> >> > as node modules on the build system, so it can compile the target JS
>> >> > files from it. We make sure all dependencies have compatible licenses.
>> >> >
>> >> > 2. On each new tripleo-ui release, the build system compiles new target
>> >> > JS files using the dependencies from step 1.
>> >> >
>> >> > 3. The build system adds the compiled files to the new package (which
>> >> > is otherwise based on the tripleo-ui distgit repo).
>> >> >
>> >> >
>> >> > Questions:
>> >> >
>> >> > - Is this workflow plausible/acceptable/feasible?
>> >> >
>> >>
>> >> Yes, though I'm not sure what you mean by "build system".
>> >> The build system has no internet access; apart from the base OS, the only
>> >> things available are the provided sources plus dependencies declared as
>> >> BuildRequires (which must be packaged).
>> >
>> > I guess this is the biggest problem: our build system is responsible for
>> > both fetching any dependencies and for actually building the project.
>> > Preferably, we'd like to run "npm install" from the RPM spec. This step
>> > requires internet access, and therefore violates one of the rules. Since
>> > our dependencies are fixed/pinned, there is a fair degree of certainty
>> > that the dependencies will be the same each time the build is run[1].
>> > Npm install brings in sources, not artifacts.
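>> >
>> > To make that concrete, the naive approach would be something like the
>> > following in the spec's %build section, and the "npm install" line is
>> > exactly the network access that isn't allowed (a sketch, not a real spec):
>> >
>> >     %build
>> >     npm install      # contacts the npm registry, which the build hosts cannot do
>> >     npm run build    # minify using the freshly fetched node_modules/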
>> >
>>
>> There's ongoing work to integrate language-"native" packages with the
>> RPM ecosystem (the so-called modularity workgroup in Fedora).
>> In the future, the build system will have access to an internal npm
>> registry mirror (and PyPI for Python, etc.), but that work is still in
>> progress and there are many issues to solve (notably reproducibility).
>>
>> For now, we either have to package dependencies or bundle them. Our
>> build and delivery infrastructure is provided by CentOS; these are not
>> constraints we can bypass, nor infrastructure we can maintain on our own.
>>
>> The landscape of software engineering was very different when
>> GNU/Linux packaging and distributions were created, and it'll take
>> time to adapt to modern software engineering (for the better or the
>> worse).
>>
>> > What would be your preferred solution? Should we try and use xstatic?
>> > We could also bundle our dependencies along with the source tarball ---
>> > would that ease your mind?
>> >
>>
>> xstatic would be nice but I'd prefer that you check with our horizon
>> developers, first. They have more experience on that topic, especially
>> Mattias who also maintained horizon packaging.
>> Yes, direct dependencies can be bundled; we can also tolerate bundling the
>> minification toolchain (provided it complies with our licensing terms), but
>> that'd be a temporary exception.
>>
>> > [1]: Yes, I know about the leftpad fiasco :)
>> >
>> >>
>> >> > - If it is, would that flow be acceptable for now only, or even
>> >> > permanently, given that all sources are free software and the build is
>> >> > transparent and reproducible?
>> >> >
>> >>
>> >> That's the stable state we want to reach.
>> >> Depending on the amount of work needed, we may tolerate temporary
>> >> exceptions, but they have to be approved by RDO maintainers in our
>> >> weekly meeting.
>> >>
>> >> > - Would version changes for dependencies have to be reviewed separately,
>> >> > or could they be updated with each new build based on the version
>> >> > information in the upstream repo's package.json file?
>> >> >
>> >>
>> >> The process is:
>> >> * initial review when you introduce a new package (except if it's already
>> >> packaged in Fedora, as those have an exemption from the legal team)
>> >> * version bumps are done directly without peer review (well, release
>> >> wranglers and CI are checking sanity)
>> >>
>> >> > - Could steps 1. and 2. be combined, so tools and dependencies are
>> >> > updated and installed on each new release? (Assuming dependency changes
>> >> > are reviewed beforehand.)
>> >> >
>> >> >
>> >> > Thanks for clarifying!
>> >> > Florian
>> >> >
>> >>
>> >> Up to a certain extent. While we tolerate bundling web assets, I'd
>> >> prefer that we don't bundle the toolchain, and that we try to keep it
>> >> stable. Not that we'd enforce strict constraints on that, but remember
>> >> that we have limited resources to maintain the whole distribution.
>> >>
>> >> Regards,
>> >> H.
>> >>