On 2016-07-19 16:29, Haïkel wrote:
2016-07-19 16:08 GMT+02:00 Florian Fuchs <flfuchs(a)redhat.com>:
>
>
> So a couple of questions, assuming/suggesting the following workflow:
>
>
> 1. In a first step, we make sure the build tools and dependencies exist as
> node modules on the build system, so it can compile the target JS files from
> them. We make sure all dependencies have compatible licenses.
>
> 2. On each new tripleo-ui release, the build system compiles new target JS
> files using the dependencies from step 1.
>
> 3. The build system adds the compiled files to the new package (which is
> otherwise based on the tripleo-ui distgit repo).
>
>
> Questions:
>
> - Is this workflow plausible/acceptable/feasible?
>
Yes, though I'm not sure what you mean by "build system" here.
The build system has no internet access; apart from the base OS, the
only things available are the provided sources plus the dependencies
declared as BuildRequires (which must themselves be packaged).
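To make that concrete, the target model looks roughly like this (the
package names below are hypothetical; whatever tripleo-ui actually
needs would have to be packaged first, and the config file name is
just an example):

    # excerpt of a hypothetical tripleo-ui.spec
    BuildRequires:  nodejs-packaging
    BuildRequires:  npm(webpack)
    BuildRequires:  npm(react)

    %build
    # no network access here: webpack has to resolve everything from
    # the node modules installed via the BuildRequires above
    webpack --config webpack.config.js

Nothing gets downloaded during the build; everything comes in through
BuildRequires.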
I guess this is the biggest problem: our JavaScript build process is
responsible both for fetching the dependencies and for actually
building the project. Ideally, we'd run "npm install" from the RPM
spec, but that step requires internet access and therefore violates
one of the rules. Since our dependencies are pinned to exact versions,
there is a fair degree of certainty that the build will pull in the
same dependencies every time it is run[1].
Note that npm install brings in sources, not artifacts.
What would be your preferred solution? Should we try to use xstatic?
We could also bundle our dependencies along with the source tarball;
would that ease your mind?
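For the bundling option, I'm thinking of something along these lines
(rough sketch; it assumes package.json defines a "build" script and
that the pinned dependency tree can simply be unpacked at build time):

    # on a machine with network access, when cutting the release
    npm install                        # resolves the pinned dependencies
    tar -czf tripleo-ui-deps.tar.gz node_modules

    # later, inside the offline RPM build
    tar -xzf tripleo-ui-deps.tar.gz
    npm run build                      # runs the "build" script from package.json

That way the actual RPM build stays offline.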
[1]: Yes, I know about the leftpad fiasco :)
> - If it is, would that flow be acceptable for now only, or even
> permanently, given that all sources are free software and the build is
> transparent and reproducible?
>
That's the stable state we want to reach.
Depending on the amount of work needed, we may tolerate temporary
exceptions, but they have to be approved by RDO maintainers in our
weekly meeting.
> - Would version changes for dependencies have to be reviewed separately or
> could they be updated with each new build based on the version information
> in the upstream repo's package.json file?
>
The process is:
* an initial review when you introduce a new package (unless it's
already packaged in Fedora, since Fedora packages have an exemption
from the legal team)
* version bumps are done directly, without peer review (though the
release wranglers and CI do sanity-check them)
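In other words, if the upstream package.json moves a pinned dependency
forward, e.g. (made-up excerpt and version numbers):

    "dependencies": {
      "react": "15.1.0",
      "react-router": "2.4.1"
    }

then updating the corresponding package just follows the normal bump
path, with no new review needed.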
> - Could steps 1. and 2. be combined, so tools and dependencies are updated
> and installed on each new release? (Assuming dependency changes are
> reviewed beforehand.)
>
>
> Thanks for clarifying!
> Florian
>
Up to a certain extent. While we tolerate bundling web assets, I'd
prefer that we not bundle the toolchain itself, and that we keep it
stable. Not that we'd enforce strict constraints on that, but remember
that we have limited resources to maintain the whole distribution.
Regards,
H.
_______________________________________________
rdo-list mailing list
rdo-list(a)redhat.com
https://www.redhat.com/mailman/listinfo/rdo-list
To unsubscribe: rdo-list-unsubscribe(a)redhat.com