[Rdo-list] Fwd: [heat]: stack stays interminably under the status create in progress

ICHIBA Sara ichi.sara at gmail.com
Thu May 7 14:16:06 UTC 2015


I just received this email from John and applied what he suggested, and it
worked. He's right: packstack is using Keystone v2 while Heat tries to use
Keystone v3. For those of you running into the same problem, here
is a hint on how to solve it.
---------- Forwarded message ----------
From: Haller, John H (John) <john.haller at alcatel-lucent.com>
Date: 2015-05-07 15:56 GMT+02:00
Subject: RE: [Rdo-list] [heat]: stack stays interminably under the status
create in progress
To: ICHIBA Sara <ichi.sara at gmail.com>


We had a problem using WaitCondition, and it turned out that Keystone v3
was not completely enabled. Not sure if this will help with your problem or not,
but give it a try. If it does solve it, I'm guessing there is some missing
configuration in packstack, so feel free to pass this on to the list.

$ heat-keystone-setup-domain \
    --stack-user-domain-name heat_user_domain \
    --stack-domain-admin heat_domain_admin \
    --stack-domain-admin-password heat_domain_password



Please update your heat.conf with the following in [DEFAULT]:

stack_user_domain_id=UUID of heat_user_domain
stack_domain_admin=heat_domain_admin
stack_domain_admin_password=heat_domain_password
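
For illustration, a filled-in [DEFAULT] section might look like the sketch
below; the UUID shown is only a placeholder. Substitute the actual ID of
heat_user_domain (it can be looked up, for example, with "openstack domain
list" against the v3 endpoint, if your client supports the v3 identity API):

[DEFAULT]
# placeholder UUID -- replace with the real ID of heat_user_domain
stack_user_domain_id=2d9b5c3e8f0a4c7e9b1d6a4f3c2e1b0a
stack_domain_admin=heat_domain_admin
stack_domain_admin_password=heat_domain_password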



Restart the Heat API service and the Heat engine service.
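
On an RDO/CentOS 7 install that would typically be something like the
following (the service names are an assumption and may differ on your
distribution):

# systemctl restart openstack-heat-api openstack-heat-engine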





Regards,

John Haller

Alcatel-Lucent

Naperville, Illinois 60563



From: rdo-list-bounces at redhat.com [mailto:rdo-list-bounces at redhat.com] On Behalf Of ICHIBA Sara
Sent: Thursday, May 07, 2015 7:41 AM
To: Steven Hardy; rdo-list at redhat.com
Subject: Re: [Rdo-list] [heat]: stack stays interminably under the status create in progress



I found this: https://bugs.launchpad.net/heat/+bug/1405110
Apparently I'm not the only one to have this problem.



2015-05-07 14:26 GMT+02:00 ICHIBA Sara <ichi.sara at gmail.com>:





2015-05-07 14:08 GMT+02:00 ICHIBA Sara <ichi.sara at gmail.com>:

Actually, Nova is working: I just spawned a VM with the same flavor and
image, but when I try to do the same with Heat it fails. Below are some logs:

 nova-compute.log

2015-05-07 13:58:56.208 3928 AUDIT nova.compute.manager
[req-b57bfc38-9d77-48fc-8185-474d1f9076a6 None] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Starting instance...
2015-05-07 13:58:56.376 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Attempting claim: memory 512 MB, disk
1 GB
2015-05-07 13:58:56.376 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Total memory: 3791 MB, used: 512.00 MB
2015-05-07 13:58:56.377 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] memory limit: 5686.50 MB, free:
5174.50 MB
2015-05-07 13:58:56.377 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Total disk: 13 GB, used: 0.00 GB
2015-05-07 13:58:56.378 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] disk limit not specified, defaulting
to unlimited
2015-05-07 13:58:56.395 3928 AUDIT nova.compute.claims [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Claim successful
2015-05-07 13:58:56.590 3928 INFO nova.scheduler.client.report [-]
Compute_service record updated for ('localhost.localdomain',
'localhost.localdomain')
2015-05-07 13:58:56.787 3928 INFO nova.scheduler.client.report [-]
Compute_service record updated for ('localhost.localdomain',
'localhost.localdomain')
2015-05-07 13:58:57.269 3928 INFO nova.virt.libvirt.driver [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Creating image
2015-05-07 13:59:27.642 3928 INFO oslo.messaging._drivers.impl_rabbit [-]
Connecting to AMQP server on 192.168.5.33:5672
2015-05-07 13:59:27.661 3928 AUDIT nova.compute.resource_tracker [-]
Auditing locally available compute resources
2015-05-07 13:59:27.702 3928 INFO oslo.messaging._drivers.impl_rabbit [-]
Connected to AMQP server on 192.168.5.33:5672
2015-05-07 13:59:27.800 3928 INFO nova.scheduler.client.report [-]
Compute_service record updated for ('localhost.localdomain',
'localhost.localdomain')
2015-05-07 13:59:28.066 3928 AUDIT nova.compute.resource_tracker [-] Total
physical ram (MB): 3791, total allocated virtual ram (MB): 1024
2015-05-07 13:59:28.066 3928 AUDIT nova.compute.resource_tracker [-] Free
disk (GB): 12
2015-05-07 13:59:28.067 3928 AUDIT nova.compute.resource_tracker [-] Total
usable vcpus: 4, total allocated vcpus: 0
2015-05-07 13:59:28.067 3928 AUDIT nova.compute.resource_tracker [-] PCI
stats: []
2015-05-07 13:59:28.101 3928 INFO nova.scheduler.client.report [-]
Compute_service record updated for ('localhost.localdomain',
'localhost.localdomain')
2015-05-07 13:59:28.101 3928 INFO nova.compute.resource_tracker [-]
Compute_service record updated for
localhost.localdomain:localhost.localdomain
2015-05-07 13:59:47.110 3928 WARNING nova.virt.disk.vfs.guestfs [-] Failed
to close augeas aug_close: do_aug_close: you must call 'aug-init' first to
initialize Augeas
2015-05-07 13:59:51.364 3928 INFO nova.compute.manager [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] VM Started (Lifecycle Event)
2015-05-07 13:59:51.384 3928 INFO nova.virt.libvirt.driver [-] [instance:
399c6150-2877-4c5f-865f-c7ac4c5c7ed5] Instance spawned successfully.
2015-05-07 14:00:28.264 3928 AUDIT nova.compute.resource_tracker [-]
Auditing locally available compute resources
2015-05-07 14:00:29.007 3928 AUDIT nova.compute.resource_tracker [-] Total
physical ram (MB): 3791, total allocated virtual ram (MB): 1024
2015-05-07 14:00:29.008 3928 AUDIT nova.compute.resource_tracker [-] Free
disk (GB): 12
2015-05-07 14:00:29.009 3928 AUDIT nova.compute.resource_tracker [-] Total
usable vcpus: 4, total allocated vcpus: 1
2015-05-07 14:00:29.009 3928 AUDIT nova.compute.resource_tracker [-] PCI
stats: []
2015-05-07 14:00:29.048 3928 INFO nova.scheduler.client.report [-]
Compute_service record updated for ('localhost.localdomain',
'localhost.localdomain')
2015-05-07 14:00:29.048 3928 INFO nova.compute.resource_tracker [-]
Compute_service record updated for
localhost.localdomain:localhost.localdomain



 heat-engine.log

2015-05-07 14:02:45.177 3942 INFO heat.engine.service
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None] Creating stack
my_first_stack
2015-05-07 14:02:45.194 3942 DEBUG stevedore.extension
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] found extension
EntryPoint.parse('AWSTemplateFormatVersion.2010-09-09 =
heat.engine.cfn.template:CfnTemplate') _load_plugins
/usr/lib/python2.7/site-packages/stevedore/extension.py:156
2015-05-07 14:02:45.194 3942 DEBUG stevedore.extension
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] found extension
EntryPoint.parse('heat_template_version.2013-05-23 =
heat.engine.hot.template:HOTemplate20130523') _load_plugins
/usr/lib/python2.7/site-packages/stevedore/extension.py:156
2015-05-07 14:02:45.195 3942 DEBUG stevedore.extension
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] found extension
EntryPoint.parse('HeatTemplateFormatVersion.2012-12-12 =
heat.engine.cfn.template:HeatTemplate') _load_plugins
/usr/lib/python2.7/site-packages/stevedore/extension.py:156
2015-05-07 14:02:45.195 3942 DEBUG stevedore.extension
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] found extension
EntryPoint.parse('heat_template_version.2014-10-16 =
heat.engine.hot.template:HOTemplate20141016') _load_plugins
/usr/lib/python2.7/site-packages/stevedore/extension.py:156
2015-05-07 14:02:45.224 3942 DEBUG heat.engine.parameter_groups
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None]
<heat.engine.hot.template.HOTemplate20130523 object at 0x38d05d0> __init__
/usr/lib/python2.7/site-packages/heat/engine/parameter_groups.py:31
2015-05-07 14:02:45.225 3942 DEBUG heat.engine.parameter_groups
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None]
<heat.engine.hot.parameters.HOTParameters object at 0x3a695d0> __init__
/usr/lib/python2.7/site-packages/heat/engine/parameter_groups.py:32
2015-05-07 14:02:45.225 3942 DEBUG heat.engine.parameter_groups
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None] Validating Parameter
Groups. validate
/usr/lib/python2.7/site-packages/heat/engine/parameter_groups.py:43
2015-05-07 14:02:45.226 3942 DEBUG heat.engine.parameter_groups
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None] ['OS::stack_id'] validate
/usr/lib/python2.7/site-packages/heat/engine/parameter_groups.py:44
2015-05-07 14:02:45.233 3942 INFO heat.engine.resource
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None] Validating Server
"my_instance"
2015-05-07 14:02:45.385 3942 DEBUG heat.common.keystoneclient
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 None] Using stack domain
9ce7896d79914b68aef34d397fd3fde4 __init__
/usr/lib/python2.7/site-packages/heat/common/heat_keystoneclient.py:115
2015-05-07 14:02:45.405 3942 INFO urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] Starting new HTTP connection
(1): 192.168.5.33
2015-05-07 14:02:45.448 3942 DEBUG urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] "GET
/v2/fac261cd98974411a9b2e977cd9ec876/os-keypairs/userkey HTTP/1.1" 200 674
_make_request /usr/lib/python2.7/site-packages/urllib3/connectionpool.py:357
2015-05-07 14:02:45.486 3942 DEBUG glanceclient.common.http
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] curl -i -X GET -H 'User-Agent:
python-glanceclient' -H 'Content-Type: application/octet-stream' -H
'Accept-Encoding: gzip, deflate' -H 'Accept: */*' -H 'X-Auth-Token:
{SHA1}bc08e056cd78f34e9ada3dd99ceae37d33daf3f0'
http://192.168.5.33:9292/v1/images/detail?limit=20&name=cirros
log_curl_request
/usr/lib/python2.7/site-packages/glanceclient/common/http.py:122
2015-05-07 14:02:45.488 3942 INFO urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] Starting new HTTP connection
(1): 192.168.5.33
2015-05-07 14:02:45.770 3942 DEBUG urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] "GET
/v1/images/detail?limit=20&name=cirros HTTP/1.1" 200 481 _make_request
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:357
2015-05-07 14:02:45.771 3942 DEBUG glanceclient.common.http
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ]
HTTP/1.1 200 OK
date: Thu, 07 May 2015 12:02:45 GMT
content-length: 481
content-type: application/json; charset=UTF-8
x-openstack-request-id: req-c62b407b-0d8f-43c6-a8be-df02015dc846

{"images": [{"status": "active", "deleted_at": null, "name": "cirros",
"deleted": false, "container_format": "bare", "created_at":
"2015-05-06T14:27:54", "disk_format": "qcow2", "updated_at":
"2015-05-06T15:01:15", "min_disk": 0, "protected": false, "id":
"d80b5a24-2567-438f-89f8-b381a6716887", "min_ram": 0, "checksum":
"133eae9fb1c98f45894a4e60d8736619", "owner":
"3740df0f18754509a252738385d375b9", "is_public": true, "virtual_size":
null, "properties": {}, "size": 13200896}]}
 log_http_response
/usr/lib/python2.7/site-packages/glanceclient/common/http.py:135
2015-05-07 14:02:46.049 3942 DEBUG keystoneclient.auth.identity.v3
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] Making authentication request
to http://192.168.5.33:35357/v3/auth/tokens get_auth_ref
/usr/lib/python2.7/site-packages/keystoneclient/auth/identity/v3.py:117
2015-05-07 14:02:46.052 3942 INFO urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] Starting new HTTP connection
(1): 192.168.5.33
2015-05-07 14:02:46.247 3942 DEBUG urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] "POST /v3/auth/tokens HTTP/1.1"
201 8275 _make_request
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:357
2015-05-07 14:02:46.250 3942 DEBUG keystoneclient.session
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] REQ: curl -i -X POST
http://192.168.5.33:35357/v3/OS-TRUST/trusts -H "User-Agent:
python-keystoneclient" -H "Content-Type: application/json" -H
"X-Auth-Token: TOKEN_REDACTED" -d '{"trust": {"impersonation": true,
"project_id": "fac261cd98974411a9b2e977cd9ec876", "trustor_user_id":
"24f6dac3f1444d89884c1b1977bb0d87", "roles": [{"name":
"heat_stack_owner"}], "trustee_user_id":
"4f7e2e76441e483982fb863ed02fe63e"}}' _http_log_request
/usr/lib/python2.7/site-packages/keystoneclient/session.py:155
2015-05-07 14:02:46.252 3942 INFO urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] Starting new HTTP connection
(1): 192.168.5.33
2015-05-07 14:02:46.381 3942 DEBUG urllib3.connectionpool
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] "POST /v3/OS-TRUST/trusts
HTTP/1.1" 201 717 _make_request
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:357
2015-05-07 14:02:46.383 3942 DEBUG keystoneclient.session
[req-6b9bcd97-7a13-4bc3-984c-470e6f5949b9 ] RESP: [201] {'content-length':
'717', 'vary': 'X-Auth-Token', 'server': 'Apache/2.4.6 (CentOS)',
'connection': 'close', 'date': 'Thu, 07 May 2015 12:02:46 GMT',
'content-type': 'application/json'}
RESP BODY: {"trust": {"impersonation": true, "roles_links": {"self": "
http://192.168.5.33:35357/v3/OS-TRUST/trusts/96e0e32b7c504d3f8f9b82fcc2658e5f/roles",
"previous": null, "next": null}, "deleted_at": null, "trustor_user_id":
"24f6dac3f1444d89884c1b1977bb0d87", "links": {"self": "
http://192.168.5.33:35357/v3/OS-TRUST/trusts/96e0e32b7c504d3f8f9b82fcc2658e5f"},
"roles": [{"id": "cf346090e5a042ebac674bcfe14f4076", "links": {"self": "
http://192.168.5.33:35357/v3/roles/cf346090e5a042ebac674bcfe14f4076"},
"name": "heat_stack_owner"}], "remaining_uses": null, "expires_at": null,
"trustee_user_id": "4f7e2e76441e483982fb863ed02fe63e", "project_id":
"fac261cd98974411a9b2e977cd9ec876", "id":
"96e0e32b7c504d3f8f9b82fcc2658e5f"}}
 _http_log_response
/usr/lib/python2.7/site-packages/keystoneclient/session.py:182
2015-05-07 14:02:46.454 3942 DEBUG heat.engine.stack_lock [-] Engine
b3d3b0e1-5c47-4912-94e9-1b75159b9b10 acquired lock on stack
bb7d16c4-b73f-428d-9dbd-5b089748f374 acquire
/usr/lib/python2.7/site-packages/heat/engine/stack_lock.py:72
2015-05-07 14:02:46.456 3942 INFO oslo.messaging._drivers.impl_rabbit [-]
Connecting to AMQP server on 192.168.5.33:5672
2015-05-07 14:02:46.474 3942 DEBUG keystoneclient.auth.identity.v3 [-]
Making authentication request to http://192.168.5.33:35357/v3/auth/tokens
get_auth_ref
/usr/lib/python2.7/site-packages/keystoneclient/auth/identity/v3.py:117
2015-05-07 14:02:46.477 3942 INFO urllib3.connectionpool [-] Starting new
HTTP connection (1): 192.168.5.33
2015-05-07 14:02:46.481 3942 INFO oslo.messaging._drivers.impl_rabbit [-]
Connected to AMQP server on 192.168.5.33:5672
2015-05-07 14:02:46.666 3942 DEBUG urllib3.connectionpool [-] "POST
/v3/auth/tokens HTTP/1.1" 401 114 _make_request
/usr/lib/python2.7/site-packages/urllib3/connectionpool.py:357
2015-05-07 14:02:46.668 3942 DEBUG keystoneclient.session [-] Request
returned failure status: 401 request
/usr/lib/python2.7/site-packages/keystoneclient/session.py:345
2015-05-07 14:02:46.668 3942 DEBUG keystoneclient.v3.client [-] Authorization
failed. get_raw_token_from_identity_service
/usr/lib/python2.7/site-packages/keystoneclient/v3/client.py:267
2015-05-07 14:02:46.673 3942 DEBUG heat.engine.stack_lock [-] Engine
b3d3b0e1-5c47-4912-94e9-1b75159b9b10 released lock on stack
bb7d16c4-b73f-428d-9dbd-5b089748f374 release
/usr/lib/python2.7/site-packages/heat/engine/stack_lock.py:122
2015-05-07 14:03:33.623 3942 INFO oslo.messaging._drivers.impl_rabbit
[req-18ad2c0b-ab52-4793-99a5-255a010d83fe ] Connecting to AMQP server on
192.168.5.33:5672
2015-05-07 14:03:33.645 3942 INFO oslo.messaging._drivers.impl_rabbit
[req-18ad2c0b-ab52-4793-99a5-255a010d83fe ] Connected to AMQP server on
192.168.5.33:5672






2015-05-07 11:45 GMT+02:00 Steven Hardy <shardy at redhat.com>:

On Tue, May 05, 2015 at 01:18:17PM +0200, ICHIBA Sara wrote:
>    hello,
>
>    here is my HOT template, it's very basic:
>
>    heat_template_version: 2013-05-23
>
>    description: Simple template to deploy a single compute instance
>
>    resources:
>      my_instance:
>        type: OS::Nova::Server
>        properties:
>          image: Cirros 0.3.3
>          flavor: m1.small
>          key_name: userkey
>          networks:
>            - network: fdf2bb77-a828-401d-969a-736a8028950f
>
>    for the logs please find them attached.

These logs are a little confusing - it looks like you failed to create the
stack due to some validation errors, then tried again and did a stack-check
and a stack resume?

Can you please set debug = True in the [DEFAULT] section of your heat.conf,
restart heat-engine, and try again?
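
A minimal sketch of that change (keep whatever is already in [DEFAULT]; only
the debug line is added):

[DEFAULT]
debug = True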

Also, some basic checks are:

1. When the stack is CREATE_IN_PROGRESS, what does nova list show for the
instance?

2. Is it possible to boot an instance using nova boot, using the same
arguments (image, flavor, key, etc.) that you specify in the heat template?
(An example invocation is sketched below.)
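
For example, a nova boot call mirroring the template above might look like
this (the instance name heat-debug-test is made up; the image, flavor, key
and network ID are taken from the template):

nova boot --image "Cirros 0.3.3" --flavor m1.small --key-name userkey \
  --nic net-id=fdf2bb77-a828-401d-969a-736a8028950f heat-debug-test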

I suspect that Heat is not actually the problem here, and that some part of
Nova is either misconfigured or not running, but I can't prove that without
seeing the nova CLI output and/or the nova logs.

Steve