[rdo-list] [Tripleo] No hosts available after deleting stack

Samuel Monderer smonderer at vasonanetworks.com
Tue Mar 28 10:52:28 UTC 2017


Hi Dan,

When I run "ironic node-list", it shows that all nodes are available.
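
For reference, a quick way to see both sides of the capability match the
scheduler was complaining about, using the node and flavor names from this
thread ("openstack overcloud profiles list" applies only if your
python-tripleoclient provides it):

    ironic node-show rhos-compute0 | grep capabilities   # what the node advertises
    openstack flavor show compute -c properties          # what the flavor requests
    openstack overcloud profiles list                    # per-node profile assignment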

Samuel

On Mon, Mar 27, 2017 at 7:54 PM Dan Sneddon <dsneddon at redhat.com> wrote:

> On 03/27/2017 07:10 AM, Samuel Monderer wrote:
> > Hi,
> >
> > After deleting a stack and running the deployment again, I get "No hosts
> > available" for the compute node.
> > /var/log/nova/nova-scheduler shows the following:
> > 2017-03-27 16:34:56.510 19784 DEBUG
> > nova.scheduler.filters.compute_capabilities_filter
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > (edge-rhos-director.vasonanetworks.com,
> > 82d68395-d33e-4332-99ff-e48daa38861a) ram: 0MB disk: 0MB io_ops: 0
> > instances: 0 fails extra_spec requirements. 'compute' does not match
> > 'control' _satisfies_extra_specs
> >
> /usr/lib/python2.7/site-packages/nova/scheduler/filters/compute_capabilities_filter.py:103
> > 2017-03-27 16:34:56.511 19784 DEBUG
> > nova.scheduler.filters.compute_capabilities_filter
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > (edge-rhos-director.vasonanetworks.com,
> > 82d68395-d33e-4332-99ff-e48daa38861a) ram: 0MB disk: 0MB io_ops: 0
> > instances: 0 fails instance_type extra_specs requirements host_passes
> >
> /usr/lib/python2.7/site-packages/nova/scheduler/filters/compute_capabilities_filter.py:112
> > 2017-03-27 16:34:56.512 19784 INFO nova.filters
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > Filter ComputeCapabilitiesFilter returned 0 hosts
> > 2017-03-27 16:34:56.512 19784 DEBUG nova.filters
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > Filtering removed all hosts for the request with instance ID
> > '4fa2af02-8c0b-4440-aa3b-10ffe2016d48'. Filter results: [('RetryFilter',
> > [(u'edge-rhos-director.vasonanetworks.com',
> > u'82d68395-d33e-4332-99ff-e48daa38861a')]),
> > ('TripleOCapabilitiesFilter', [(u'edge-rhos-director.vasonanetworks.com',
> > u'82d68395-d33e-4332-99ff-e48daa38861a')]),
> > ('ComputeCapabilitiesFilter', None)] get_filtered_objects
> > /usr/lib/python2.7/site-packages/nova/filters.py:129
> > 2017-03-27 16:34:56.513 19784 INFO nova.filters
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > Filtering removed all hosts for the request with instance ID
> > '4fa2af02-8c0b-4440-aa3b-10ffe2016d48'. Filter results: ['RetryFilter:
> > (start: 2, end: 1)', 'TripleOCapabilitiesFilter: (start: 1, end: 1)',
> > 'ComputeCapabilitiesFilter: (start: 1, end: 0)']
> > 2017-03-27 16:34:56.514 19784 DEBUG nova.scheduler.filter_scheduler
> > [req-12df93c6-1226-4292-9938-e6ace00f88cc
> > eece59c584b8439e992306a27ab78eb7 5495048e618b444288911f261b2c10e2 - - -]
> > There are 0 hosts available but 1 instances requested to build.
> > select_destinations
> > /usr/lib/python2.7/site-packages/nova/scheduler/filter_scheduler.py:71
> >
> > But when I check the available nodes for compute, I get:
> > [stack at edge-rhos-director ~]$ ironic node-list --provision-state available --maintenance false --associated false
> > +--------------------------------------+---------------+---------------+-------------+--------------------+-------------+
> > | UUID                                 | Name          | Instance UUID | Power State | Provisioning State | Maintenance |
> > +--------------------------------------+---------------+---------------+-------------+--------------------+-------------+
> > | ded46c41-c5a5-4aa7-a1ee-3df75e6cf976 | rhos-compute0 | None          | power off   | available          | False       |
> > +--------------------------------------+---------------+---------------+-------------+--------------------+-------------+
> >
> > [stack at edge-rhos-director ~]$ ironic node-show rhos-compute0
> >
> > +------------------------+-------------------------------------------------------------------------+
> > | Property               | Value                                                                   |
> > +------------------------+-------------------------------------------------------------------------+
> > | chassis_uuid           |                                                                         |
> > | clean_step             | {}                                                                      |
> > | console_enabled        | False                                                                   |
> > | created_at             | 2017-03-23T14:41:35+00:00                                               |
> > | driver                 | pxe_drac                                                                |
> > | driver_info            | {u'deploy_kernel': u'95a09fe8-919a-4c15-9f0e-c5936b6bcecf',             |
> > |                        | u'drac_password': u'******', u'drac_username': u'root',                 |
> > |                        | u'deploy_ramdisk': u'5c5f25c4-c9bf-4859-aa33-2e4352eb5620',             |
> > |                        | u'drac_host': u'192.168.61.7'}                                          |
> > | driver_internal_info   | {u'agent_url': u'http://192.0.2.9:9999', u'root_uuid_or_disk_id':       |
> > |                        | u'c2bb0683-ebe8-4541-9c5c-a811d0326ae5', u'is_whole_disk_image': False, |
> > |                        | u'drac_boot_device': {u'boot_device': u'pxe', u'persistent': True},     |
> > |                        | u'agent_last_heartbeat': 1490597251}                                    |
> > | extra                  | {u'hardware_swift_object': u'extra_hardware-ded46c41-c5a5-4aa7-a1ee-    |
> > |                        | 3df75e6cf976'}                                                          |
> > | inspection_finished_at | None                                                                    |
> > | inspection_started_at  | None                                                                    |
> > | instance_info          | {}                                                                      |
> > | instance_uuid          | None                                                                    |
> > | last_error             | None                                                                    |
> > | maintenance            | False                                                                   |
> > | maintenance_reason     | None                                                                    |
> > | name                   | rhos-compute0                                                           |
> > | network_interface      |                                                                         |
> > | power_state            | power off                                                               |
> > | properties             | {u'memory_mb': u'131072', u'cpu_arch': u'x86_64', u'local_gb': u'557',  |
> > |                        | u'cpus': u'48', u'capabilities': u'profile:compute,boot_option:local'}  |
> > | provision_state        | available                                                               |
> > | provision_updated_at   | 2017-03-27T13:34:51+00:00                                               |
> > | raid_config            |                                                                         |
> > | reservation            | None                                                                    |
> > | resource_class         |                                                                         |
> > | target_power_state     | None                                                                    |
> > | target_provision_state | None                                                                    |
> > | target_raid_config     |                                                                         |
> > | updated_at             | 2017-03-27T13:34:51+00:00                                               |
> > | uuid                   | ded46c41-c5a5-4aa7-a1ee-3df75e6cf976                                    |
> > +------------------------+-------------------------------------------------------------------------+
> >
> > and it suits the compute flavor:
> > [stack at edge-rhos-director ~]$ openstack flavor show compute
> >
> > +----------------------------+------------------------------------------------------------------+
> > | Field                      | Value                                                            |
> > +----------------------------+------------------------------------------------------------------+
> > | OS-FLV-DISABLED:disabled   | False                                                            |
> > | OS-FLV-EXT-DATA:ephemeral  | 0                                                                |
> > | access_project_ids         | None                                                             |
> > | disk                       | 40                                                               |
> > | id                         | 1f44e78f-5121-4b8f-ab9d-19a64fb1d020                             |
> > | name                       | compute                                                          |
> > | os-flavor-access:is_public | True                                                             |
> > | properties                 | capabilities:boot_option='local', capabilities:profile='compute' |
> > | ram                        | 4096                                                             |
> > | rxtx_factor                | 1.0                                                              |
> > | swap                       |                                                                  |
> > | vcpus                      | 1                                                                |
> > +----------------------------+------------------------------------------------------------------+
> >
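> > (For completeness: if the profile tag had been dropped during the stack
> > delete, one way to re-apply it would be along these lines; this is a
> > generic sketch, not something that was run here:
> >     ironic node-update ded46c41-c5a5-4aa7-a1ee-3df75e6cf976 replace properties/capabilities='profile:compute,boot_option:local'
> > In the output above the tag is still present in the node's properties.)
> >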
> > Any idea why it's not finding a suitable host?
> >
> > Samuel
> >
> >
>
> When I have seen this, it was because deleting the stack didn't fully
> reset all the nodes to available (assuming the stack really was deleted).
> Can you run "ironic node-list" at the undercloud command line and make
> sure that all your nodes are set to "available"? If not, you can manually
> set the provision state with "ironic node-set-provision-state <UUID>
> provide" for each node. Also, make sure that the stack has been fully
> deleted ("heat stack-list"), although I don't think that is the issue here.
>
> --
> Dan Sneddon         |  Senior Principal Software Engineer
> dsneddon at redhat.com |  redhat.com/openstack
> dsneddon:irc        |  @dxs:twitter
>

