Re: [Rdo-list] Glance packaging and RDO Kilo
by Alan Pevec
Moving discussion to rdo-list
> I have spent some time during the weekend thinking about the options here.
> Looking at the requirements from all parties, I see the following:
> a) From the packaging side, we want to split Glance into several packages (glance, -common, -api and -registry).
> b) From the deployment side, we want the glance package to behave as it did before, i.e. pull -api and -registry.
> c) From the puppet-glance side, if we have separate -api and -registry packages, we want to reflect that change in the Puppet modules and be able to configure -api and -registry independently. Also, this package split is already happening in Debian/Ubuntu, so removing distro-specific code is always welcome.
>
> With that in mind, I see the following options as the easiest ones to implement:
>
> 1- Split packages, with the following deps:
>
> * -api and -registry depend on -common
> * glance depends on -api and -registry
>
> This would require moving the existing content in glance (/usr/bin/glance-manage and /usr/bin/glance-control) into -common, so glance becomes a meta-package. With this, we would get b) and c), and most of a). The only drawback is that glance-manage and glance-control may not be a good fit for the -common package (Haikel, can you comment on this?). FWIW, this is how it is being packaged today in Debian.
>
> 2- Keep the old situation (no Glance package split)
>
> This obviously negates a), and keeps distro-specific code in c), but still works and does not break any existing code.
>
> Any thoughts?
>
> Regards,
> Javier
Thanks for the summary, Javier; 1) is the right thing to do. For the
record, the history of this change was:
* https://review.gerrithub.io/229724 - Split openstack-glance into new
subpackages
* https://review.gerrithub.io/229980 - Backward compatibility with
previous all-in-one main package (a quick fix after I saw Packstack
failures; in retrospect, that is where I should have introduced -common
with glance-manage, as you propose)
** in the meantime, puppet-glance was adjusted to take advantage of
the subpackages: https://review.openstack.org/172440 - Separate api
and registry packages for Red Hat (a sketch of that change follows below)
* https://review.gerrithub.io/230356 - Revert dependencies between
services and the main packages (follow-up, because after the
puppet-glance change glance-manage was not getting installed)
* https://review.gerrithub.io/230453 - Revert "Split openstack-glance
into new subpackage" (merged to unblock TripleO)
** https://review.openstack.org/174872 - Revert "Separate api and
registry packages for Red Hat" in puppet-glance
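For reference, a minimal sketch of what that puppet-glance change amounts
to once the split is back in place (resource and package names assumed
from this thread, not copied from the actual module):

  # Sketch only: with the split, puppet-glance manages -api and -registry
  # independently instead of one all-in-one package on Red Hat.
  package { 'glance-api':
    ensure => present,
    name   => 'openstack-glance-api',      # previously 'openstack-glance'
  }
  package { 'glance-registry':
    ensure => present,
    name   => 'openstack-glance-registry', # previously 'openstack-glance'
  }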
So the plan is to re-propose "Split openstack-glance into new
subpackages", merge it only after it's verified by all interested
teams, and then re-propose "Separate api and registry packages for Red
Hat" in puppet-glance.
Cheers,
Alan
[Rdo-list] Not connect to MongoDB
by Luis Gutierrez
Hi,
I need to install OpenStack all-in-one on CentOS 7, but an error occurred
while the installer was configuring MongoDB.
The problem is that it cannot connect to MongoDB; the attached image
shows the details.
If there is a patch, I would like to know how to install it, please.
Greetings and thanks
[image: Inline image 1]
--
*Luis Gutiérrez*
[Rdo-list] 10 days of rdo manager
by Mohammed Arafa
Hello all. I am currently transitioning between roles and have 10 days
available to spend on testing rdo-manager.
I am offering my help with testing and documenting as needed.
What do you guys need?
Re: [Rdo-list] 10 days of rdo manager
by James Slagle
On Mon, Apr 20, 2015 at 03:48:51PM -0700, Dan Sneddon wrote:
> On 04/20/2015 03:42 PM, James Slagle wrote:
> > On Mon, Apr 20, 2015 at 06:49:27PM +0100, Pedro Sousa wrote:
> >> Hi,
> >>
> >> I haven't played with it yet, but I would like to know what happens to
> >> overcloud nodes when you reboot, lose, or break your undercloud node.
> >>
> >> Do overcloud nodes lose IP connectivity? My understanding is that
> >> overcloud nodes get DHCP from Neutron. Or do I need to have some HA for
> >> undercloud in place?
> >
> > With the network architecture we're moving towards, overcloud nodes will only
> > get DHCP from Neutron for the provisioning network. The API, data, storage, etc.
> > networks will support static IP configuration or, possibly, non-Neutron-provided
> > DHCP.
> >
> > Further, after initial provisioning, overcloud nodes will boot off the local
> > disk instead of PXE booting via Neutron on subsequent reboots. Local boot
> > support is a relatively new feature in upstream Ironic, and we'll be enabling
> > it soon in rdo-manager.
> >
> > With these changes, when the undercloud is stopped or goes down unexpectedly,
> > the overcloud would be unaffected. That being said, we still plan to have an HA
> > undercloud at some point in the future.
> >
> > Also, the current virt-setup that allows testing rdo-manager via deploying the
> > undercloud and overcloud all on VMs still relies on the undercloud VM to
> > continue to run for connectivity to overcloud nodes. That could also be
> > enhanced, though, to not require the undercloud VM to stay up.
> >
> >
>
> When a node is configured for local boot, does it get a static IP
> address? Maybe it turns the DHCP address into a static one? Or does it
> still rely on the undercloud for DHCP?
The interface connected to the provisioning network will still get DHCP from
Neutron on the undercloud if that interface is configured to start on boot and
to try DHCP. Since we're currently including the dhcp-all-interfaces[1] element
in our image builds, that will indeed be the case.
If not, it wouldn't get any IP address. It doesn't seem like you'd want to
configure it with a static IP.
[1] https://github.com/openstack/diskimage-builder/tree/master/elements/dhcp-...
Note we've had a few requests to not default to including this element, or at
least to make it configurable not to include it.
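For context, the dhcp-all-interfaces element drops a hook into the image
that generates a DHCP interface config for every NIC it discovers at boot;
on a RHEL-family image the result is roughly the following (a sketch, the
exact generated contents vary):

  # /etc/sysconfig/network-scripts/ifcfg-eth0 (generated; sketch only)
  DEVICE=eth0
  BOOTPROTO=dhcp
  ONBOOT=yes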
--
-- James Slagle
[Rdo-list] RDO/OpenStack meetups coming up (Monday, April 20, 2015)
by Rich Bowen
The following are the meetups I'm aware of in the coming week where
OpenStack and/or RDO enthusiasts are likely to be present. If you know
of others, please let me know, and/or add them to
http://rdoproject.org/Events
If there's a meetup in your area, please consider attending. If you
attend, please consider taking a few photos, and possibly even writing
up a brief summary of what was covered.
--Rich
* Monday, April 20 in Paris, FR: Meetup #14: Intelligent placement and
advanced SLA with Nova Scheduler -
http://www.meetup.com/OpenStack-France/events/221773375/
* Monday, April 20 in Guadalajara, MX: Tools for fullstack:
"Requirements specification" -
http://www.meetup.com/OpenStack-GDL/events/221741194/
* Monday, April 20 in Cheltenham, E6, GB: Openstack Night -
http://www.meetup.com/Cheltenham-GeekNights/events/220929989/
* Tuesday, April 21 in Melbourne, AU: OpenStack Conference, part of the
CONNECT Show -
http://www.meetup.com/Australian-OpenStack-User-Group/events/220314515/
* Tuesday, April 21 in King of Prussia, PA, US: [Tech Talk] OpenStack
101 - http://www.meetup.com/ValleyForgeTech/events/210471742/
* Tuesday, April 21 in Stuttgart, DE: Second spontaneous and informal
meetup for exchanging experiences -
http://www.meetup.com/OpenStack-Baden-Wuerttemberg/events/221970705/
* Wednesday, April 22 in Amersfoort, NL: Canonical Ubuntu OpenStack
Roadshow - http://www.meetup.com/Openstack-Netherlands/events/221727218/
* Wednesday, April 22 in New York, NY, US: Deploying OpenStack with
Mirantis FUEL/ Billing and Metering with Talligent -
http://www.meetup.com/OpenStack-New-York-Meetup/events/220648431/
* Wednesday, April 22 in Durham, NC, US: Bonus April Meetup: OpenStack
Storage Projects & An Overview of Open vStorage -
http://www.meetup.com/Triangle-OpenStack-Meetup/events/221194351/
* Wednesday, April 22 in Athens, GR: Data storage in clouds -
http://www.meetup.com/Athens-OpenStack-User-Group/events/219017094/
* Wednesday, April 22 in Istanbul, TR: Different architecture examples
in OpenStack: differences, pros, and cons -
http://www.meetup.com/Turkey-OpenStack-Meetup/events/221151225/
* Thursday, April 23 in Philadelphia, PA, US: Deploying OpenStack with
Mirantis FUEL/ Billing and Metering with Talligent -
http://www.meetup.com/Philly-OpenStack-Meetup-Group/events/220648495/
* Thursday, April 23 in Denver, CO, US: Ceph, A Distributed Object Store
and File System -
http://www.meetup.com/Distributed-Computing-Denver/events/220642902/
* Thursday, April 23 in Pasadena, CA, US: Highly Available, Performant,
VXLAN Service Node. The April OpenStack LA Meetup. -
http://www.meetup.com/OpenStack-LA/events/221553823/
* Thursday, April 23 in Berlin, DE: Infracoders Berlin Meetup - CI/CD
with the OpenStack Infra Project -
http://www.meetup.com/Infracoders-Berlin/events/220873576/
* Saturday, April 25 in Bangalore, IN: OpenStack India Meetup, Bangalore
- http://www.meetup.com/Indian-OpenStack-User-Group/events/221391632/
* Saturday, April 25 in Beijing, CN: When OpenStack meets Docker -
http://www.meetup.com/China-OpenStack-User-Group/events/221807891/
* Sunday, April 26 in Xi'an, CN: OpenStack Xi'an Meetup, April -
http://www.meetup.com/Xian-OpenStack-Meetup/events/221926606/
--
Rich Bowen - rbowen@redhat.com
OpenStack Community Liaison
http://rdoproject.org/
[Rdo-list] rdo-manager installs failing at horizon's manage.py step
by James Slagle
FYI, all rdo-manager installs are currently failing during the Undercloud
installation with:
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: CommandError: An error occured during rendering /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/index.html: Error evaluating expression:
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: $brand-danger
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns:
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: From /usr/share/openstack-dashboard/static/dashboard/scss/_variables.scss:1
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: ...imported from <string u'/* Additional CSS for infrastructure. */\n@import "'...>:0
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: Traceback:
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: File "/usr/lib64/python2.7/site-packages/scss/expression.py", line 130, in evaluate_expression
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: return ast.evaluate(self, divide=divide)
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: File "/usr/lib64/python2.7/site-packages/scss/expression.py", line 359, in evaluate
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: raise SyntaxError("Undefined variable: '%s'." % self.name)
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: SyntaxError: Undefined variable: '$brand-danger'.
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: Found 'compress' tags in:
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/horizon/templates/horizon/_conf.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/index.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/undeploy_confirmation.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/scale_out.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/deploy_confirmation.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/_workflow_base.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_boxes/templates/tuskar_boxes/overview/index.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/nodes/register.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/overview/post_deploy_init.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/_fullscreen_workflow_base.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/horizon/templates/horizon/_scripts.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/nodes/index.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/share/openstack-dashboard/openstack_dashboard/templates/_stylesheets.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/base.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/base_detail.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: /usr/lib/python2.7/site-packages/tuskar_ui/infrastructure/templates/infrastructure/nodes/detail.html
16:26:11 Notice: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]/returns: Compressing...
16:26:11 Error: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]: Failed to call refresh: /usr/share/openstack-dashboard/manage.py compress returned 1 instead of one of [0]
16:26:11 Error: /Stage[main]/Horizon/Exec[refresh_horizon_django_cache]: /usr/share/openstack-dashboard/manage.py compress returned 1 instead of one of [0]
Initial investigation indicates it's probably related to this Horizon packaging
change from Friday:
https://review.gerrithub.io/#/c/230349/
It looks like that packaging change was included in the updated repo:
http://trunk.rdoproject.org/centos70/latest-RDO-trunk-CI/
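For anyone digging into it: the compress step fails because tuskar-ui's
stylesheets reference Bootstrap-style SCSS variables that the imported
_variables.scss apparently no longer defines after that change. A minimal
illustration (the value shown is the Bootstrap default, assumed here):

  /* _variables.scss must define the variable before anything uses it */
  $brand-danger: #d9534f !default;

  /* a later rule like this one then compiles instead of raising
     "Undefined variable: '$brand-danger'." */
  .warning-text { color: $brand-danger; }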
--
-- James Slagle