Re: [Xen-API] XCP 1.1b1: problems with Centos6 migration
On Sun, Jul 17, 2011 at 07:39:54AM +0400, George Shuklin wrote:
> Good day.
>
> I was too optimistic in my previous report: migration works fine
> within a single host (localhost migration), but migration between two
> hosts makes things much worse.
>
> After a few migrations, a CentOS 6 guest (PV, 1 VCPU, native CentOS
> kernel, lvmoiscsi shared SR) hangs. I ran several series; the problem
> usually appears within 3-15 loops.

You should wait for CentOS 6.1 and re-test then; the el6.1 kernel fixes
multiple save/restore/migration-related bugs. Or, if you don't want to
wait, you can also grab the kernel src.rpms from ftp.redhat.com and
rebuild them yourself.

-- Pasi

> Symptoms: the domain is present and shows state 'r' (running) in
> xentop, but consumes 0.0 seconds of CPU time and does not respond via
> network or console at all (no output, no reaction to input, including
> the magic sysrq keys, e.g. Ctrl-O t).
>
> After that, further attempts give only
> VM_FAILED_SHUTDOWN_ACKNOWLEDGMENT: [ ]
>
> As far as I understand, this is an issue with the pv-ops kernel (-xen
> kernels seem to work fine; e.g. CentOS 5.6 flies between hosts without
> any trouble).
>
> ... and I don't really know how to debug this problem any deeper: how
> can I see what happens in the (unsuccessfully) starting domain before
> any console output appears?
>
> In xensource.log (sorry for the long quotation, nothing but logs below):
>
> RECIPIENT:
>
> [20110717T03:25:22.985Z| info|srv-xh6|588 inet_rpc|sm_exec
> D:bbc19f84d324|xapi] Session.destroy
> trackid=55deffe6654a23d2a9cc602c2716f6a6
> [20110717T03:25:22.987Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|storage_access] Executed activate succesfully on VDI
> 'OpaqueRef:c34b2134-fe3e-0780-a5bf-12e4cd3edd4e'; activate refcount now:
> 1
> [20110717T03:25:22.990Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] Receiver 7a.
> Calling Vmops._restore_devices (domid
> = 2) for non-CD devices [doing this now because we call after activate]
> [20110717T03:25:23.007Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] To restore this domain we need VBDs: [
> 0/xvda:2c6f0960-2742-48ab-9935-4ddb2d5fcb28 ]
> [20110717T03:25:23.027Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] Device.Vbd.add (virtpath=xvda |
> physpath=/dev/sm/backend/3aa703ca-6699-36bd-35c3-ad03fc8718c9/2c6f0960-2742-48ab-9935-4ddb2d5fcb28
> | phystype=vhd)
> [20110717T03:25:23.028Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] adding device
> B0[/local/domain/0/backend/vbd/2/51712]
> F2[/local/domain/2/device/vbd/51712] H[/xapi/2/hotplug/vbd/51712]
> [20110717T03:25:23.031Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|hotplug] Hotplug.wait_for_plug: frontend (domid=2 |
> kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712)
> [20110717T03:25:23.031Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] watch: watching xenstore paths: [
> /xapi/2/hotplug/vbd/51712/hotplug ] with timeout 1200.000000 seconds
> [20110717T03:25:23.331Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|hotplug] Synchronised ok with hotplug script: frontend
> (domid=2 | kind=vbd | devid=51712); backend (domid=0 | kind=vbd |
> devid=51712)
> [20110717T03:25:23.334Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] set_currently_attached to true for VBD uuid
> 82a3fd30-9f00-4cd7-b9f9-afdd01ba3012
> [20110717T03:25:23.357Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] Trying to attach PIF:
> f710b21e-2c64-9614-bfca-6b77f376796e
> [20110717T03:25:23.387Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] Device.Vif.add domid=2 devid=0
> mac=a2:ef:dc:c5:06:12 carrier=true rate=none other_config=[]
> extra_private_keys=[ref=OpaqueRef:cf60a63e-2823-691e-c03f-8d988cfab422;
> vif-uuid=a8c2db6c-c948-51f6-5e15-1778d86b1e70;
> network-uuid=b6da77c9-c514-61d9-6810-a4ae821885fc]
> [20110717T03:25:23.387Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] adding device
> B0[/local/domain/0/backend/vif/2/0] F2[/local/domain/2/device/vif/0]
> H[/xapi/2/hotplug/vif/0]
> [20110717T03:25:23.391Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|hotplug] Hotplug.wait_for_plug: frontend (domid=2 |
> kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0)
> [20110717T03:25:23.391Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xenops] watch: watching xenstore paths: [
> /xapi/2/hotplug/vif/0/hotplug ] with timeout 1200.000000 seconds
> [20110717T03:25:23.538Z| info|srv-xh6|31 heartbeat|Heartbeat
> D:d80b0889665c|http] stunnel pid: 10354 (cached = false) connected to
> 10.1.3.5:443
> [20110717T03:25:23.794Z|debug|srv-xh6|236 xal_listen||event] VM (domid:
> 2) device_event = HotplugChanged on 0 {""->online}
> [20110717T03:25:23.795Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|hotplug] Synchronised ok with hotplug script: frontend
> (domid=2 | kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0)
> [20110717T03:25:23.798Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = HotplugChanged on 0 {""->online} D:06ba0c69ccd3|event]
> Adding Resync.vif to queue
> [20110717T03:25:23.798Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = HotplugChanged on 0 {""->online}
> D:06ba0c69ccd3|locking_helpers] push(per-VM queue, HotplugChanged(vif,
> 0) domid: 2); queue = [ HotplugChanged(vif, 0) domid: 2 ](1)
> [20110717T03:25:23.800Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] setting current number of vcpus to 1 (out of 1)
> [20110717T03:25:23.803Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] Receiver 7b.
> unpausing domain
> [20110717T03:25:23.812Z|debug|srv-xh6|236 xal_listen||event] VM (domid:
> 2) device_event = device thread {51712} pid=10424
> [20110717T03:25:23.813Z| info|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|http] stunnel pid: 9963 (cached = true) connected to
> 10.1.3.5:443
> [20110717T03:25:23.835Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = device thread {51712} pid=10424 D:8489d7fe9933|event]
> Adding Vbdops.set_vbd_qos to queue
> [20110717T03:25:23.835Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = device thread {51712} pid=10424
> D:8489d7fe9933|locking_helpers] push(per-VM queue, DevThread(51712,
> 10424) domid 2); queue = [ HotplugChanged(vif, 0) domid: 2;
> DevThread(51712, 10424) domid 2 ](2)
> [20110717T03:25:23.836Z|debug|srv-xh6|236 xal_listen||event] VM (domid:
> 2) device_event = device thread {51760} pid=10425
> [20110717T03:25:23.839Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] Receiver 8. signalling sender that we're done
> [20110717T03:25:23.839Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|xapi] Receiver 9a Success
> [20110717T03:25:23.839Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|locking_helpers] pop(per-VM queue) = HotplugChanged(vif,
> 0) domid: 2
> [20110717T03:25:24.009Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = device thread {51760} pid=10425 D:7ebcccec7c2c|event]
> Adding Vbdops.set_vbd_qos to queue
> [20110717T03:25:24.009Z|debug|srv-xh6|236 xal_listen|VM (domid: 2)
> device_event = device thread {51760} pid=10425
> D:7ebcccec7c2c|thread_queue] push(vm_lifecycle_op, DevThread(51760,
> 10425) domid 2): queue = [ DevThread(51760, 10425) domid 2 ](1)
> [20110717T03:25:24.009Z|debug|srv-xh6|261||thread_queue]
> pop(vm_lifecycle_op) = DevThread(51760, 10425) domid 2
> [20110717T03:25:24.010Z|debug|srv-xh6|261||event] VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: grabbing lock to
> perform: DevThread(51760, 10425)
> domid 2
> [20110717T03:25:24.011Z|debug|srv-xh6|588 inet_rpc|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> HotplugChanged(vif, 0) domid: 2 D:96e513f93cbe|event] VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: about to perform:
> HotplugChanged(vif, 0) domid: 2
> [20110717T03:25:24.014Z|debug|srv-xh6|588 inet_rpc|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> HotplugChanged(vif, 0) domid: 2 D:96e513f93cbe|event] VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf (domid: 2) Resync.vif
> OpaqueRef:cf60a63e-2823-691e-c03f-8d988cfab422
> [20110717T03:25:24.021Z|debug|srv-xh6|588 inet_rpc|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> HotplugChanged(vif, 0) domid: 2 D:96e513f93cbe|event] VIF
> OpaqueRef:cf60a63e-2823-691e-c03f-8d988cfab422: is_attached = true;
> online = true
> [20110717T03:25:24.021Z|debug|srv-xh6|588 inet_rpc|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> HotplugChanged(vif, 0) domid: 2 D:96e513f93cbe|event]
> VIF.currently_attached field is in sync
> [20110717T03:25:24.021Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|locking_helpers] pop(per-VM queue) = DevThread(51712,
> 10424) domid 2
> [20110717T03:25:24.023Z|debug|srv-xh6|588 inet_rpc|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> DevThread(51712, 10424) domid 2 D:a38442a2eb2e|event] VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: about to perform:
> DevThread(51712, 10424) domid 2
> [20110717T03:25:24.028Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|locking_helpers] Released lock on VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf with token 10
> [20110717T03:25:24.028Z|debug|srv-xh6|261||locking_helpers] Acquired
> lock on VM OpaqueRef:dd860476-057f-2b22-6120-7668828720cf with token 11
> [20110717T03:25:24.036Z|debug|srv-xh6|588 inet_rpc|VM.pool_migrate
> R:524e127c00dc|taskhelper] forwarded task destroyed
> [20110717T03:25:24.038Z|debug|srv-xh6|261|VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing
> DevThread(51760, 10425) domid 2 D:973a5d8631fb|event] VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: about to perform:
> DevThread(51760, 10425) domid 2
> [20110717T03:25:24.042Z|debug|srv-xh6|261||locking_helpers] Released
> lock on VM OpaqueRef:dd860476-057f-2b22-6120-7668828720cf with token 11
>
> SENDER:
>
> [20110717T03:25:01.740Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-d5cf70da-0e07-f363-3579-dd11117ae164
> D:27f4c75cf51a|dispatcher] Unknown rpc
> "unknown-message-d5cf70da-0e07-f363-3579-dd11117ae164"
> [20110717T03:25:01.743Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:5337c58feb87 created by task
> D:af663a5fb7f0
> [20110717T03:25:01.749Z|debug|srv-xh5|861
> inet-RPC|dispatch:unknown-message-0eb150ce-acbd-f892-3d22-69521767303e
> D:ab180dd3e254|dispatcher] Unknown rpc
> "unknown-message-0eb150ce-acbd-f892-3d22-69521767303e"
> [20110717T03:25:01.751Z|debug|srv-xh5|861 inet-RPC||cli] xe vm-migrate
> uuid=a0e508db-fb6d-4ad9-4d68-9e6d20a0d2c5 host=srv-xh6 username=root
> password=null
> [20110717T03:25:01.989Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|audit] VM.pool_migrate: VM =
> 'a0e508db-fb6d-4ad9-4d68-9e6d20a0d2c5 (1499_5393)'; host =
> '0923dc25-9db9-43ad-b4e6-df8c10aa7e94 (srv-xh6)'
> [20110717T03:25:01.993Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Raised at db_cache_types.ml:75.27-76 ->
> db_cache_types.ml:118.2-40 -> pervasiveext.ml:22.2-9
> [20110717T03:25:01.994Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] host srv-xh6; available_memory = 50067652608;
> memory_required = 1084227584
> [20110717T03:25:01.994Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi_ha_vm_failover]
> assert_configuration_change_preserves_ha_plan c = configuration_change =
> { old_vms_leaving = [ ]; new_vms_arriving = [ 4f82279c (srv-xh6)
> dd860476 (1499_5393) ]; hosts_to_disable
> = [ ]; num_failures = no
> change; new_vms = [ ] }
> [20110717T03:25:01.999Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|thread_queue] push(long_running_op, VM.pool_migrate
> VM.pool_migrate R:524e127c00dc): queue = [ VM.pool_migrate
> VM.pool_migrate R:524e127c00dc ](1)
> [20110717T03:25:02.000Z|debug|srv-xh5|872||thread_queue]
> pop(long_running_op) = VM.pool_migrate VM.pool_migrate R:524e127c00dc
> [20110717T03:25:02.000Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|locking_helpers] Acquired lock on VM
> OpaqueRef:dd860476-057f-2b22-6120-7668828720cf with token 6
> [20110717T03:25:02.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] VM is running; attempting migration
> [20110717T03:25:02.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender doing a dead migration
> [20110717T03:25:02.002Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender 1. Logging into remote server
> [20110717T03:25:02.004Z|debug|srv-xh5|874 unix-RPC||dummytaskhelper]
> task dispatch:session.slave_login D:bcc430648842 created by task
> R:524e127c00dc
> [20110717T03:25:02.009Z| info|srv-xh5|874 unix-RPC|session.slave_login
> D:9067ddf4f38a|xapi] Session.create
> trackid=fe7e3d8d48fcd164b2a29b75c4a2147b pool=true uname=
> is_local_superuser=true auth_user_sid=
> parent=trackid=9834f5af41c964e225f24279aefe4e49
> [20110717T03:25:02.012Z|debug|srv-xh5|875 unix-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:1d7451396ce3 created by task
> D:9067ddf4f38a
> [20110717T03:25:02.015Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|memory_control] rebalance_memory
> [20110717T03:25:02.015Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenstore-rpc] Checking pid 6807
> [20110717T03:25:02.016Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenstore-rpc] Written request using id:
> 4fb7bfff-7ae1-8b8b-e7fa-ab47552facfe
> [20110717T03:25:02.016Z|debug|srv-xh5|861
> inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] watch: watching xenstore paths: [
> /squeezed/rpc/response/balance-memory/4fb7bfff-7ae1-8b8b-e7fa-ab47552facfe
> ] with timeout 1200.000000 seconds
> [20110717T03:25:02.020Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender 2. Transmitting an HTTP CONNECT to URI:
> /migrate?ref=OpaqueRef:dd860476-057f-2b22-6120-7668828720cf&memory_required_kib=1048576
> [20110717T03:25:02.035Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-1b727ec6-2845-b957-ce71-04ab690be24d
> D:c8d559522194|dispatcher] Unknown rpc
> "unknown-message-1b727ec6-2845-b957-ce71-04ab690be24d"
> [20110717T03:25:02.038Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:session.slave_login D:8b953fb837f0 created by task
> D:eaf893862dbe
> [20110717T03:25:02.041Z| info|srv-xh5|769 inet-RPC|session.slave_login
> D:d266425e2834|xapi] Session.create
> trackid=a4a82daebd2b29a9dad6114e986271a3 pool=true uname=
> is_local_superuser=true auth_user_sid=
> parent=trackid=9834f5af41c964e225f24279aefe4e49
> [20110717T03:25:02.043Z|debug|srv-xh5|877 unix-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:85916d34d95b created by task
> D:d266425e2834
> [20110717T03:25:02.048Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-16044a28-af26-93a6-ce4e-a92d1c8b6a64
> D:81ab25a8e098|dispatcher] Unknown rpc
> "unknown-message-16044a28-af26-93a6-ce4e-a92d1c8b6a64"
> [20110717T03:25:02.051Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:pool.audit_log_append D:234e98efa72f created by task
> D:eaf893862dbe
> [20110717T03:25:02.054Z| info|srv-xh5|769
> inet-RPC|dispatch:pool.audit_log_append D:234e98efa72f|taskhelper] task
> pool.audit_log_append R:108324b9a15a
> (uuid:a0a66d00-c651-ab85-d22b-9726af8faf2b) created
> (trackid=a4a82daebd2b29a9dad6114e986271a3) by task D:eaf893862dbe
> [20110717T03:25:02.058Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-9527f44a-683f-a50d-1227-45a9e49aa111
> D:c17e4dd3c5f5|dispatcher] Unknown rpc
> "unknown-message-9527f44a-683f-a50d-1227-45a9e49aa111"
> [20110717T03:25:02.061Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:session.logout D:f7a3ba820dce created by task
> D:eaf893862dbe
> [20110717T03:25:02.063Z| info|srv-xh5|769 inet-RPC|session.logout
> D:026cf0ac142a|xapi] Session.destroy
> trackid=a4a82daebd2b29a9dad6114e986271a3
> [20110717T03:25:02.118Z|debug|srv-xh5|506 inet-RPC||xapi] Raised at
> db_cache_types.ml:75.27-76 -> db_cache_types.ml:118.2-40 ->
> pervasiveext.ml:22.2-9
> [20110717T03:25:02.132Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-f32cf413-5ead-09cc-ebe1-c134931037db
> D:f541b82fb842|dispatcher] Unknown rpc
> "unknown-message-f32cf413-5ead-09cc-ebe1-c134931037db"
> [20110717T03:25:02.135Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:d0a51881c600 created by task
> D:7873d345ba5c
> [20110717T03:25:02.217Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-2ecdc50f-5f61-8ae8-e00d-0b0c96dee087
> D:db3950aa4a50|dispatcher] Unknown rpc
> "unknown-message-2ecdc50f-5f61-8ae8-e00d-0b0c96dee087"
> [20110717T03:25:02.220Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:f2196d9b65be created by task
> R:524e127c00dc
> [20110717T03:25:02.228Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-8c594cdc-028c-c846-071f-312931eab03e
> D:9148389a81ef|dispatcher] Unknown rpc
> "unknown-message-8c594cdc-028c-c846-071f-312931eab03e"
> [20110717T03:25:02.232Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:28863224bcf7 created by task
> R:524e127c00dc
> [20110717T03:25:02.238Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-ace421d9-05e7-56ae-a22f-7205beaa0e4a
> D:2e4608f366d6|dispatcher] Unknown rpc
> "unknown-message-ace421d9-05e7-56ae-a22f-7205beaa0e4a"
> [20110717T03:25:02.241Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:PBD.get_all_records
> D:da5c53e47e69 created by task
> R:524e127c00dc
> [20110717T03:25:02.255Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-7ead9cf4-6d0f-b8c2-311b-7a8269ee1cfd
> D:b3ca1554cff8|dispatcher] Unknown rpc
> "unknown-message-7ead9cf4-6d0f-b8c2-311b-7a8269ee1cfd"
> [20110717T03:25:02.258Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:9ae1d8a347e5 created by task
> R:524e127c00dc
> [20110717T03:25:02.265Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-6281ddf3-7779-7ef5-dcce-4bf19d36caff
> D:be361fa5c4e6|dispatcher] Unknown rpc
> "unknown-message-6281ddf3-7779-7ef5-dcce-4bf19d36caff"
> [20110717T03:25:02.269Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:322103060db5 created by task
> R:524e127c00dc
> [20110717T03:25:02.313Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-947529a2-837a-fae5-5e62-99286ca60145
> D:cb4404e7f87f|dispatcher] Unknown rpc
> "unknown-message-947529a2-837a-fae5-5e62-99286ca60145"
> [20110717T03:25:02.316Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:PBD.get_all_records D:b9e9aa63b911 created by task
> R:524e127c00dc
> [20110717T03:25:02.329Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-8bcea59c-ffbd-34b4-43eb-b0b5fd7fc820
> D:1b825c719144|dispatcher] Unknown rpc
> "unknown-message-8bcea59c-ffbd-34b4-43eb-b0b5fd7fc820"
> [20110717T03:25:02.333Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_other_config D:173e14c71a41 created by task
> R:524e127c00dc
> [20110717T03:25:02.340Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-1fb7f3da-15c5-e3ec-b804-85e90312e7b1
> D:eec2cca57fcb|dispatcher] Unknown rpc
> "unknown-message-1fb7f3da-15c5-e3ec-b804-85e90312e7b1"
> [20110717T03:25:02.343Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:7313dbb4784b created by task
> R:524e127c00dc
> [20110717T03:25:02.457Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-5e026680-cbf4-4658-9e26-5bbf129cfe77
> D:915f97e05f7a|dispatcher] Unknown rpc
> "unknown-message-5e026680-cbf4-4658-9e26-5bbf129cfe77"
> [20110717T03:25:02.460Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_other_config D:8e9499647984 created by task
> R:524e127c00dc
> [20110717T03:25:02.467Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-7fbc503f-a46e-84d5-0077-52243d919e8f
> D:08dc80f181e2|dispatcher] Unknown rpc
> "unknown-message-7fbc503f-a46e-84d5-0077-52243d919e8f"
> [20110717T03:25:02.470Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:e114d1d174c6 created by task
> R:524e127c00dc
> [20110717T03:25:02.517Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-53b2daa6-19c2-3a95-3d8c-0713616915ee
> D:7189557e47fe|dispatcher] Unknown rpc
> "unknown-message-53b2daa6-19c2-3a95-3d8c-0713616915ee"
> [20110717T03:25:02.520Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:VDI.get_sm_config D:00d59f14e36e created by task
> R:524e127c00dc
> [20110717T03:25:02.554Z|debug|srv-xh5|506 inet-RPC||xapi] Raised at
> db_cache_types.ml:75.27-76 -> db_cache_types.ml:118.2-40 ->
> pervasiveext.ml:22.2-9
> [20110717T03:25:02.566Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-5e8b73e7-21c7-8d28-c388-4946ae443687
> D:ee3e651e1c02|dispatcher] Unknown rpc
> "unknown-message-5e8b73e7-21c7-8d28-c388-4946ae443687"
> [20110717T03:25:02.569Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:92ce25fd9e93 created by task
> D:3cc1fbecd2cf
> [20110717T03:25:02.649Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-90a874a4-1763-f5d3-f38a-14ccb238bd8c
> D:1686de56f91f|dispatcher] Unknown rpc
> "unknown-message-90a874a4-1763-f5d3-f38a-14ccb238bd8c"
> [20110717T03:25:02.652Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:82af869b00d0 created by task
> R:524e127c00dc
> [20110717T03:25:02.659Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-02c62b3f-990b-b6be-71e7-2b47e5b0e373
> D:cb8f1708f741|dispatcher] Unknown rpc
> "unknown-message-02c62b3f-990b-b6be-71e7-2b47e5b0e373"
> [20110717T03:25:02.662Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:PBD.get_all_records D:63d5d25b5ecd created by task
> R:524e127c00dc
> [20110717T03:25:02.674Z|debug|srv-xh5|769
> inet-RPC|dispatch:unknown-message-34dfd6f8-54cb-cf3c-9f3d-cc032e8189c6
> D:fb5ea6cf3151|dispatcher] Unknown rpc
> "unknown-message-34dfd6f8-54cb-cf3c-9f3d-cc032e8189c6"
> [20110717T03:25:02.677Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:45244eaabb53 created by task
> R:524e127c00dc
> [20110717T03:25:03.096Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender 4. calling Domain.suspend (domid = 1; hvm =
> false)
> [20110717T03:25:03.097Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Domain.suspend domid=1
> [20110717T03:25:03.097Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] connect: args = [ -fd
> 322504e6-b5e0-c620-3b8f-c94aae892555 -mode save -domid 1 -fork true ]
> [20110717T03:25:03.105Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Blocking for suspend notification from xenguest
> [20110717T03:25:03.109Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] got suspend notification from xenguesthelper
> [20110717T03:25:03.109Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Requesting shutdown of domain 1
> [20110717T03:25:03.110Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Waiting for PV domain 1 to acknowledge shutdown
> request
> [20110717T03:25:03.111Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] watch: watching xenstore paths: [
> /local/domain/1/control/shutdown ] with timeout 10.000000 seconds
> [20110717T03:25:03.111Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Domain acknowledged shutdown request
> [20110717T03:25:03.111Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] MTC: calling xal.wait_release timeout=1.000000
> [20110717T03:25:04.114Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] MTC: calling xal.wait_release timeout=1.000000
> [20110717T03:25:04.114Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] done\n
> [20110717T03:25:04.121Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] Had 0 unexplained entries in p2m
> table\nSaving memory pages: iter 1 0%
> [20110717T03:25:17.066Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 5%
> [20110717T03:25:17.415Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 10%
> [20110717T03:25:17.594Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 15%
> [20110717T03:25:17.773Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 20%
> [20110717T03:25:17.941Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 25%
> [20110717T03:25:18.123Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 30%
> [20110717T03:25:18.310Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 35%
> [20110717T03:25:18.488Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 40%
> [20110717T03:25:18.670Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 45%
> [20110717T03:25:18.834Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 50%
> [20110717T03:25:19.018Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 55%
> [20110717T03:25:19.201Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b
> 60%
> [20110717T03:25:19.384Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 65%
> [20110717T03:25:19.567Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 70%
> [20110717T03:25:19.735Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 75%
> [20110717T03:25:19.922Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 80%
> [20110717T03:25:20.101Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 85%
> [20110717T03:25:20.290Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 90%
> [20110717T03:25:20.474Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \b\b\b\b 95%
> [20110717T03:25:20.640Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] \r 1: sent 262144, skipped 0,
> [20110717T03:25:20.640Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] delta 16520ms, dom0 11%, target 0%, sent
> 519Mb/s, dirtied 730Mb/s -1217028608 pages\n
> [20110717T03:25:20.640Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] Total pages sent= 262144 (1.00x)\n
> [20110717T03:25:20.640Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] (of which 0 were fixups)\n
> [20110717T03:25:20.640Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] All memory is saved\n
> [20110717T03:25:20.652Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenguesthelper] Save exit rc=0\n
> [20110717T03:25:20.653Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] migration_progress = 1.00
> [20110717T03:25:20.653Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Final result:
> [20110717T03:25:20.653Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Suspend for domid 1 finished
> [20110717T03:25:20.654Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender 5. waiting for blocks to flush
> [20110717T03:25:20.654Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] Device.Vbd.request_shutdown frontend (domid=1 |
> kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712) force
> [20110717T03:25:20.654Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] xenstore-write
> /local/domain/0/backend/vbd/1/51712/shutdown-request = force
> [20110717T03:25:20.654Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] watch: watching xenstore paths: [
> /local/domain/0/backend/vbd/1/51712/shutdown-done ] with timeout
> 1200.000000 seconds
> [20110717T03:25:20.656Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xenops] VBD backends have flushed
> [20110717T03:25:20.656Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|xapi] Sender 5a. Deactivating VDIs
> [20110717T03:25:20.656Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|storage_access] Considering execute deactivate on VDI
> 'OpaqueRef:c34b2134-fe3e-0780-a5bf-12e4cd3edd4e'; activate refcount now:
> 1
> [20110717T03:25:20.656Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate
> R:524e127c00dc|sm] SM lvmoiscsi vdi_deactivate
> sr=OpaqueRef:a1077822-9453-de43-0efa-ecb0fe508f59
> vdi=OpaqueRef:c34b2134-fe3e-0780-a5bf-12e4cd3edd4e
> [20110717T03:25:20.657Z|debug|srv-xh5|861 inet-RPC|sm_exec
> D:30b89d60d8c5|xapi] Raised at db_cache_types.ml:75.27-76 ->
> db_cache_types.ml:118.2-40 -> pervasiveext.ml:22.2-9
> [20110717T03:25:20.659Z| info|srv-xh5|861 inet-RPC|sm_exec
> D:a422fab8f94a|xapi] Session.create
> trackid=8a1c909d031af45eef1af1c16b83bfa4 pool=false uname=
> is_local_superuser=true auth_user_sid=
> parent=trackid=9834f5af41c964e225f24279aefe4e49
>
> [20110717T03:25:20.660Z|debug|srv-xh5|326 xal_listen||event] VM (domid:
> 1) device_event = device shutdown {vbd,51712}
> [20110717T03:25:20.661Z|debug|srv-xh5|326 xal_listen|VM (domid: 1)
> device_event = device shutdown {vbd,51712} D:8fcb1674e914|event] Adding
> Resync.vbd to queue
> [20110717T03:25:20.661Z|debug|srv-xh5|326 xal_listen|VM (domid: 1)
> device_event = device shutdown {vbd,51712}
> D:8fcb1674e914|locking_helpers] push(per-VM queue, DevShutdownDone(vbd,
> 51712) domid: 1); queue = [ DevShutdownDone(vbd, 51712) domid: 1 ](1)
> [20110717T03:25:20.662Z|debug|srv-xh5|878 unix-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:6416926c95c6 created by task
> D:a422fab8f94a
> [20110717T03:25:20.739Z|debug|srv-xh5|879 unix-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:e4492518324b created by task
> R:524e127c00dc
> [20110717T03:25:20.745Z|debug|srv-xh5|880 unix-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:aa523fad6d08 created by task
> R:524e127c00dc
> [20110717T03:25:20.749Z|debug|srv-xh5|881 unix-RPC||dummytaskhelper]
> task dispatch:PBD.get_all_records D:e342bdc29be7 created by task
> R:524e127c00dc
> [20110717T03:25:20.758Z|debug|srv-xh5|882 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:862b4dd2b17e created by task
> R:524e127c00dc
> [20110717T03:25:20.763Z|debug|srv-xh5|883 unix-RPC||dummytaskhelper]
> task dispatch:host.get_other_config D:62e0206f0a8d created by task
> R:524e127c00dc
> [20110717T03:25:20.805Z|debug|srv-xh5|884 unix-RPC||dummytaskhelper]
> task dispatch:PBD.get_all_records D:018837a3a115 created by task
> R:524e127c00dc
> [20110717T03:25:20.814Z|debug|srv-xh5|885 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_other_config D:7b621f89b5eb created by task
> R:524e127c00dc
> [20110717T03:25:20.819Z|debug|srv-xh5|886 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:4ee5086a1b4a created by task
> R:524e127c00dc
> [20110717T03:25:20.968Z|debug|srv-xh5|887 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_other_config D:5dc86257d8aa created by task
> R:524e127c00dc
> [20110717T03:25:20.973Z|debug|srv-xh5|888 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_sm_config D:2b0f3370d6b8 created by task
> R:524e127c00dc
> [20110717T03:25:21.053Z|debug|srv-xh5|889 unix-RPC||dummytaskhelper]
> task dispatch:VDI.get_sm_config D:1365c96ff65b created by task
> R:524e127c00dc
> [20110717T03:25:21.059Z|debug|srv-xh5|890 unix-RPC||dummytaskhelper]
> task dispatch:VDI.get_by_uuid D:3a06e3935dae created by task
> R:524e127c00dc
> [20110717T03:25:21.063Z|debug|srv-xh5|891 unix-RPC||dummytaskhelper]
> task dispatch:VDI.get_sm_config D:1551b544d4d7 created by task
> R:524e127c00dc
> [20110717T03:25:21.146Z| info|srv-xh5|892
> unix-RPC|session.login_with_password D:fc6e0fd55e5c|xapi] Session.create
> trackid=c7254b4ee213f904468db9d99aad81e3 pool=false uname=root
> is_local_superuser=true auth_user_sid=
> parent=trackid=9834f5af41c964e225f24279aefe4e49
> [20110717T03:25:21.148Z|debug|srv-xh5|893 unix-RPC||dummytaskhelper]
> task dispatch:session.get_uuid D:3e2ef7bc707b created by task
> D:fc6e0fd55e5c
> [20110717T03:25:21.176Z| info|srv-xh5|899
> unix-RPC|dispatch:VDI.remove_from_sm_config D:0739bca94433|api_effect]
> VDI.remove_from_sm_config
> [20110717T03:25:21.181Z| info|srv-xh5|900
> unix-RPC|dispatch:VDI.remove_from_sm_config D:c291a6652eb0|api_effect]
> VDI.remove_from_sm_config
> [20110717T03:25:21.186Z| info|srv-xh5|901 unix-RPC|session.logout
> D:95729441ad3f|xapi] Session.destroy
> trackid=c7254b4ee213f904468db9d99aad81e3
> [20110717T03:25:21.189Z|debug|srv-xh5|902 unix-RPC||dummytaskhelper]
> task dispatch:VDI.get_by_uuid D:82977ea7716e created by task
> R:524e127c00dc
> [20110717T03:25:21.194Z|debug|srv-xh5|903 unix-RPC||dummytaskhelper]
> task dispatch:VDI.get_SR D:ecb0b5029946 created by task R:524e127c00dc
> [20110717T03:25:21.199Z|debug|srv-xh5|904 unix-RPC||dummytaskhelper]
> task dispatch:SR.get_uuid D:19633a79a896 created by task R:524e127c00dc
> [20110717T03:25:21.203Z|debug|srv-xh5|905 unix-RPC||dummytaskhelper]
> task
dispatch:SR.get_by_uuid D:8e2bca902494 created by task > R:524e127c00dc > [20110717T03:25:21.208Z|debug|srv-xh5|906 unix-RPC||dummytaskhelper] > task dispatch:SR.get_type D:febefab168b5 created by task R:524e127c00dc > [20110717T03:25:21.213Z|debug|srv-xh5|907 unix-RPC||dummytaskhelper] > task dispatch:SM.get_all_records_where D:7af199168cea created by task > R:524e127c00dc > [20110717T03:25:21.219Z|debug|srv-xh5|908 unix-RPC||dummytaskhelper] > task dispatch:SM.get_driver_filename D:13305f374ecc created by task > R:524e127c00dc > [20110717T03:25:21.224Z|debug|srv-xh5|909 unix-RPC||dummytaskhelper] > task dispatch:VM.get_all_records_where D:8cffc0c4d82c created by task > R:524e127c00dc > [20110717T03:25:21.233Z|debug|srv-xh5|910 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records_where D:200b58202c7e created by task > R:524e127c00dc > [20110717T03:25:21.238Z|debug|srv-xh5|911 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_device_config D:6c10b06d167f created by task > R:524e127c00dc > [20110717T03:25:21.860Z|debug|srv-xh5|924 unix-RPC||dummytaskhelper] > task dispatch:VDI.get_by_uuid D:5c20f2c1bd1d created by task > R:524e127c00dc > [20110717T03:25:21.865Z|debug|srv-xh5|925 unix-RPC||dummytaskhelper] > task dispatch:host.get_by_uuid D:40b691ff7cc3 created by task > R:524e127c00dc > [20110717T03:25:21.869Z|debug|srv-xh5|926 unix-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:d58b855d3328 created by task > R:524e127c00dc > [20110717T03:25:21.874Z|debug|srv-xh5|927 unix-RPC||dummytaskhelper] > task dispatch:VDI.remove_from_sm_config D:4894e9f33205 created by task > R:524e127c00dc > [20110717T03:25:21.875Z| info|srv-xh5|927 > unix-RPC|dispatch:VDI.remove_from_sm_config D:4894e9f33205|api_effect] > VDI.remove_from_sm_config > [20110717T03:25:21.886Z| info|srv-xh5|861 inet-RPC|sm_exec > D:a422fab8f94a|xapi] Session.destroy > trackid=8a1c909d031af45eef1af1c16b83bfa4 > [20110717T03:25:21.887Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > 
R:524e127c00dc|storage_access] Executed deactivate in backend succesfully > [20110717T03:25:21.887Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender 6. signalling remote to unpause > [20110717T03:25:21.887Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender 6a. Detaching VDIs > [20110717T03:25:21.888Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|sm] SM lvmoiscsi vdi_detach > sr=OpaqueRef:a1077822-9453-de43-0efa-ecb0fe508f59 > vdi=OpaqueRef:c34b2134-fe3e-0780-a5bf-12e4cd3edd4e > [20110717T03:25:21.889Z|debug|srv-xh5|861 inet-RPC|sm_exec > D:69c3c89d6b89|xapi] Raised at db_cache_types.ml:75.27-76 -> > db_cache_types.ml:118.2-40 -> pervasiveext.ml:22.2-9 > [20110717T03:25:21.891Z| info|srv-xh5|861 inet-RPC|sm_exec > D:d42fb7605e99|xapi] Session.create > trackid=2d11838049b96a0d313ba1a9ac738e13 pool=false uname= > is_local_superuser=true auth_user_sid= > parent=trackid=9834f5af41c964e225f24279aefe4e49 > [20110717T03:25:21.896Z|debug|srv-xh5|928 unix-RPC||dummytaskhelper] > task dispatch:session.get_uuid D:8c60fd0b9b5f created by task > D:d42fb7605e99 > [20110717T03:25:21.909Z|debug|srv-xh5|506 inet-RPC||xapi] Raised at > db_cache_types.ml:75.27-76 -> db_cache_types.ml:118.2-40 -> > pervasiveext.ml:22.2-9 > [20110717T03:25:21.922Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-0cb119d4-35f0-1cf5-16b0-aae0867cab6d > D:8c37aa00f8d1|dispatcher] Unknown rpc > "unknown-message-0cb119d4-35f0-1cf5-16b0-aae0867cab6d" > [20110717T03:25:21.925Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:session.get_uuid D:75a3345c8b11 created by task > D:bbc19f84d324 > [20110717T03:25:21.978Z|debug|srv-xh5|929 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:b68b9332216b created by task > R:524e127c00dc > [20110717T03:25:21.983Z|debug|srv-xh5|930 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:5564ca8c5e13 created by task > R:524e127c00dc > 
[20110717T03:25:21.988Z|debug|srv-xh5|931 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:144dc57ac00c created by task > R:524e127c00dc > [20110717T03:25:21.996Z|debug|srv-xh5|932 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:706f2466d7ed created by task > R:524e127c00dc > [20110717T03:25:22.001Z|debug|srv-xh5|933 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:1b25419f79f4 created by task > R:524e127c00dc > [20110717T03:25:22.008Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-7380fac6-3be3-0383-a351-ca2a9e18bf06 > D:3b360231c009|dispatcher] Unknown rpc > "unknown-message-7380fac6-3be3-0383-a351-ca2a9e18bf06" > [20110717T03:25:22.011Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:67aa8722cce1 created by task > R:524e127c00dc > [20110717T03:25:22.018Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-cd5a58d6-fd21-c7de-4030-6c7638be5b83 > D:b16de4eb8789|dispatcher] Unknown rpc > "unknown-message-cd5a58d6-fd21-c7de-4030-6c7638be5b83" > [20110717T03:25:22.021Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:5b97b5e565c3 created by task > R:524e127c00dc > [20110717T03:25:22.028Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-a30634f2-8bff-21d9-36da-c5b46415d361 > D:af0d9427b045|dispatcher] Unknown rpc > "unknown-message-a30634f2-8bff-21d9-36da-c5b46415d361" > [20110717T03:25:22.031Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:201c11015bef created by task > R:524e127c00dc > [20110717T03:25:22.044Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-744cfaaf-903f-0817-d1e4-17262d7c6ba5 > D:dd90d844bfb8|dispatcher] Unknown rpc > "unknown-message-744cfaaf-903f-0817-d1e4-17262d7c6ba5" > [20110717T03:25:22.047Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:4d4a0a0c29ca created by task > R:524e127c00dc > [20110717T03:25:22.050Z|debug|srv-xh5|934 
unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:624c666c6f30 created by task > R:524e127c00dc > [20110717T03:25:22.055Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-f9ac5050-d729-4bc0-7ec2-5eb3a9a16dcd > D:10f75014e727|dispatcher] Unknown rpc > "unknown-message-f9ac5050-d729-4bc0-7ec2-5eb3a9a16dcd" > [20110717T03:25:22.058Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:1e139dff25df created by task > R:524e127c00dc > [20110717T03:25:22.059Z|debug|srv-xh5|935 unix-RPC||dummytaskhelper] > task dispatch:SR.get_other_config D:acc65ba88f3b created by task > R:524e127c00dc > [20110717T03:25:22.064Z|debug|srv-xh5|936 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:1f7a34b66475 created by task > R:524e127c00dc > [20110717T03:25:22.103Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-ddf14d28-8825-d57e-decf-f878807c7086 > D:a5006b262f0c|dispatcher] Unknown rpc > "unknown-message-ddf14d28-8825-d57e-decf-f878807c7086" > [20110717T03:25:22.106Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:8757edab0ee9 created by task > R:524e127c00dc > [20110717T03:25:22.119Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-ed21bedc-7fe6-fd92-dcd1-d8bf6389ac1e > D:8f81d874312a|dispatcher] Unknown rpc > "unknown-message-ed21bedc-7fe6-fd92-dcd1-d8bf6389ac1e" > [20110717T03:25:22.122Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SR.get_other_config D:ed9284c21981 created by task > R:524e127c00dc > [20110717T03:25:22.128Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-48a21f3b-e4ac-725b-448d-7718c98ba70b > D:b48119e5e24a|dispatcher] Unknown rpc > "unknown-message-48a21f3b-e4ac-725b-448d-7718c98ba70b" > [20110717T03:25:22.131Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:7d421515431f created by task > R:524e127c00dc > [20110717T03:25:22.147Z|debug|srv-xh5|937 unix-RPC||dummytaskhelper] > task 
dispatch:SR.get_other_config D:43d6228f2840 created by task > R:524e127c00dc > [20110717T03:25:22.151Z|debug|srv-xh5|938 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:c76ff4bac18d created by task > R:524e127c00dc > [20110717T03:25:22.231Z|debug|srv-xh5|939 unix-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:f57b2562de78 created by task > R:524e127c00dc > [20110717T03:25:22.243Z| info|srv-xh5|861 inet-RPC|sm_exec > D:d42fb7605e99|xapi] Session.destroy > trackid=2d11838049b96a0d313ba1a9ac738e13 > [20110717T03:25:22.244Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|storage_access] Executed detach succesfully on VDI > '2c6f0960-2742-48ab-9935-4ddb2d5fcb28'; attach refcount now: 0 > [20110717T03:25:22.244Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|monitor_rrds] Sending RRD for VM > uuid=a0e508db-fb6d-4ad9-4d68-9e6d20a0d2c5 to remote host for migrate > [20110717T03:25:22.244Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|monitor_rrds] Sending RRD for object > uuid=a0e508db-fb6d-4ad9-4d68-9e6d20a0d2c5 archiving=false to address: > 10.1.3.6 > [20110717T03:25:22.247Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|monitor_rrds] Sending rrd to 10.1.3.6 > [20110717T03:25:22.251Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-cfee954c-c29c-d757-15d0-c417c69ad27f > D:d640eaea2606|dispatcher] Unknown rpc > "unknown-message-cfee954c-c29c-d757-15d0-c417c69ad27f" > [20110717T03:25:22.256Z| info|srv-xh5|769 inet-RPC|session.slave_login > D:2dd78dfe096d|xapi] Session.create > trackid=daa4d955ccd60ddf8c85db270114ebc1 pool=true uname= > is_local_superuser=true auth_user_sid= > parent=trackid=9834f5af41c964e225f24279aefe4e49 > [20110717T03:25:22.258Z|debug|srv-xh5|940 unix-RPC||dummytaskhelper] > task dispatch:session.get_uuid D:5f84c9640436 created by task > D:2dd78dfe096d > [20110717T03:25:22.282Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|monitor_rrds] Sent > 
[20110717T03:25:22.286Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-d7eeef81-adea-76b2-5bcd-c7809df26a53 > D:9ce50fca6acb|dispatcher] Unknown rpc > "unknown-message-d7eeef81-adea-76b2-5bcd-c7809df26a53" > [20110717T03:25:22.286Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender 7. waiting for all-clear from remote > [20110717T03:25:22.303Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-41e746f3-f874-4bed-891e-3e85c23802cd > D:c6808dbcafc7|dispatcher] Unknown rpc > "unknown-message-41e746f3-f874-4bed-891e-3e85c23802cd" > [20110717T03:25:22.303Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:SR.get_other_config D:f58fd6b46485 created by task > R:524e127c00dc > [20110717T03:25:22.307Z| info|srv-xh5|769 inet-RPC|session.logout > D:1e7eb98da29a|xapi] Session.destroy > trackid=daa4d955ccd60ddf8c85db270114ebc1 > [20110717T03:25:22.310Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-efff5417-af73-cfd5-d9a3-125179744ff7 > D:f57cd3b3f8c8|dispatcher] Unknown rpc > "unknown-message-efff5417-af73-cfd5-d9a3-125179744ff7" > [20110717T03:25:22.313Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:37efa8c4e1df created by task > R:524e127c00dc > [20110717T03:25:22.361Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-ab65508c-a174-9467-4f33-42ba51540ba0 > D:5d8e466229d4|dispatcher] Unknown rpc > "unknown-message-ab65508c-a174-9467-4f33-42ba51540ba0" > [20110717T03:25:22.364Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:62c3b9bcbe83 created by task > R:524e127c00dc > [20110717T03:25:22.372Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-72015697-be9e-3b4c-8eb2-b419d6a8947b > D:ca521ca62002|dispatcher] Unknown rpc > "unknown-message-72015697-be9e-3b4c-8eb2-b419d6a8947b" > [20110717T03:25:22.375Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_by_uuid D:08e7832abcc7 created by task > R:524e127c00dc > 
[20110717T03:25:22.382Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-cfbae17d-bab7-9935-0b36-c97337a8622f > D:e79db0a98e7e|dispatcher] Unknown rpc > "unknown-message-cfbae17d-bab7-9935-0b36-c97337a8622f" > [20110717T03:25:22.385Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:host.get_by_uuid D:a750b6ed78dd created by task > R:524e127c00dc > [20110717T03:25:22.392Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-b2a4aaf8-3449-83ef-6b5f-517d777abc4f > D:6c67599f57fc|dispatcher] Unknown rpc > "unknown-message-b2a4aaf8-3449-83ef-6b5f-517d777abc4f" > [20110717T03:25:22.395Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:9f34b4a304dd created by task > R:524e127c00dc > [20110717T03:25:22.402Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-163bc2b9-1311-dd60-c7d1-259d254e267d > D:ea6b50defeb6|dispatcher] Unknown rpc > "unknown-message-163bc2b9-1311-dd60-c7d1-259d254e267d" > [20110717T03:25:22.405Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:VDI.add_to_sm_config D:666c34cd9dc0 created by task > R:524e127c00dc > [20110717T03:25:22.405Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.add_to_sm_config D:666c34cd9dc0|api_effect] > VDI.add_to_sm_config > [20110717T03:25:22.411Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-e2791ba2-4a82-1b2b-2c44-68b8b2a4e31d > D:0917a1a3249e|dispatcher] Unknown rpc > "unknown-message-e2791ba2-4a82-1b2b-2c44-68b8b2a4e31d" > [20110717T03:25:22.414Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:e503a8fd11e7 created by task > R:524e127c00dc > [20110717T03:25:22.421Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-cabb0496-9c71-d2a4-f4b7-1e901efc0318 > D:1952f9dc400a|dispatcher] Unknown rpc > "unknown-message-cabb0496-9c71-d2a4-f4b7-1e901efc0318" > [20110717T03:25:22.424Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_by_uuid D:b45d956a9969 created by task > R:524e127c00dc > 
[20110717T03:25:22.431Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-b2164bef-c3b7-9376-da4e-9b03d67fc29f > D:4262505e022c|dispatcher] Unknown rpc > "unknown-message-b2164bef-c3b7-9376-da4e-9b03d67fc29f" > [20110717T03:25:22.434Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:VDI.get_SR D:c762a3654966 created by task R:524e127c00dc > [20110717T03:25:22.441Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-05524297-c362-5c92-9bc0-fd7ff7c0d2f8 > D:5b7c4ef29ee1|dispatcher] Unknown rpc > "unknown-message-05524297-c362-5c92-9bc0-fd7ff7c0d2f8" > [20110717T03:25:22.444Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SR.get_uuid D:9af3cc84037f created by task R:524e127c00dc > [20110717T03:25:22.450Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-9bd0df06-96c5-b337-c171-81c5f35f7208 > D:1b1be302da52|dispatcher] Unknown rpc > "unknown-message-9bd0df06-96c5-b337-c171-81c5f35f7208" > [20110717T03:25:22.454Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:SR.get_by_uuid D:98ee43cfd1f3 created by task > R:524e127c00dc > [20110717T03:25:22.461Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-c271b4de-3409-84ed-4441-a69c4d06bada > D:65836aad2fda|dispatcher] Unknown rpc > "unknown-message-c271b4de-3409-84ed-4441-a69c4d06bada" > [20110717T03:25:22.464Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SR.get_type D:fd9ca00af80a created by task R:524e127c00dc > [20110717T03:25:22.470Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-31aa1014-9595-fb4d-0d5c-97e63fb488e7 > D:b80255077d67|dispatcher] Unknown rpc > "unknown-message-31aa1014-9595-fb4d-0d5c-97e63fb488e7" > [20110717T03:25:22.473Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:SM.get_all_records_where D:86eb822f25f7 created by task > R:524e127c00dc > [20110717T03:25:22.483Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-7c3182c5-325e-fa8f-8afd-5181a8697278 > D:f47892577d4a|dispatcher] Unknown rpc > 
"unknown-message-7c3182c5-325e-fa8f-8afd-5181a8697278" > [20110717T03:25:22.486Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:SM.get_driver_filename D:955957f17b89 created by task > R:524e127c00dc > [20110717T03:25:22.493Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-ec61df76-912b-5aae-c9da-25c7c947fb3c > D:31f15425ea50|dispatcher] Unknown rpc > "unknown-message-ec61df76-912b-5aae-c9da-25c7c947fb3c" > [20110717T03:25:22.496Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:VM.get_all_records_where D:83a00dbee0fd created by task > R:524e127c00dc > [20110717T03:25:22.510Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-24e59ec8-94ce-dada-e5db-9069bbdd0980 > D:e35591a8faf9|dispatcher] Unknown rpc > "unknown-message-24e59ec8-94ce-dada-e5db-9069bbdd0980" > [20110717T03:25:22.513Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records_where D:a27cc948a937 created by task > R:524e127c00dc > [20110717T03:25:22.520Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-9b8e6ed2-1d5e-b28a-fdf4-6a37835d7963 > D:b7b17293b509|dispatcher] Unknown rpc > "unknown-message-9b8e6ed2-1d5e-b28a-fdf4-6a37835d7963" > [20110717T03:25:22.523Z|debug|srv-xh5|941 inet-RPC||dummytaskhelper] > task dispatch:PBD.get_device_config D:7fc588726092 created by task > R:524e127c00dc > [20110717T03:25:22.530Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-9c2718d0-2a78-d2f0-11b0-387715ed6e22 > D:efa785340286|dispatcher] Unknown rpc > "unknown-message-9c2718d0-2a78-d2f0-11b0-387715ed6e22" > [20110717T03:25:22.540Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-b4b2ccfa-3b63-04a7-ca83-786871db76b9 > D:b5c58a02e594|dispatcher] Unknown rpc > "unknown-message-b4b2ccfa-3b63-04a7-ca83-786871db76b9" > [20110717T03:25:22.551Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-61af23b7-d6e0-688b-9b4e-5cf226ca9261 > D:00e98908b99a|dispatcher] Unknown rpc > "unknown-message-61af23b7-d6e0-688b-9b4e-5cf226ca9261" > 
[20110717T03:25:22.565Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-c4f68dab-56db-1371-48cc-3bb35d971fdb > D:cf528a20b69f|dispatcher] Unknown rpc > "unknown-message-c4f68dab-56db-1371-48cc-3bb35d971fdb" > [20110717T03:25:22.576Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-d481656e-8cdc-b92a-c31e-33a22472c414 > D:562e10cfcd2f|dispatcher] Unknown rpc > "unknown-message-d481656e-8cdc-b92a-c31e-33a22472c414" > [20110717T03:25:22.620Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-481dbfb5-a905-f846-ebb8-f731c506b885 > D:8835fe2b52fe|dispatcher] Unknown rpc > "unknown-message-481dbfb5-a905-f846-ebb8-f731c506b885" > [20110717T03:25:22.636Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-1d870de2-68d2-1da8-ed63-cf5bb467576d > D:a3202584d31a|dispatcher] Unknown rpc > "unknown-message-1d870de2-68d2-1da8-ed63-cf5bb467576d" > [20110717T03:25:22.645Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-a45485d8-6bae-ed18-4dce-fd18b6898808 > D:234cad4e2f96|dispatcher] Unknown rpc > "unknown-message-a45485d8-6bae-ed18-4dce-fd18b6898808" > [20110717T03:25:22.695Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-6c8cef2e-e2e1-4c21-73ce-60fede66328f > D:89bb5da8f980|dispatcher] Unknown rpc > "unknown-message-6c8cef2e-e2e1-4c21-73ce-60fede66328f" > [20110717T03:25:22.705Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-7bd78c4f-5e90-1a06-8b25-e3cd517f22e7 > D:5be67a7a8653|dispatcher] Unknown rpc > "unknown-message-7bd78c4f-5e90-1a06-8b25-e3cd517f22e7" > [20110717T03:25:22.754Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-e10e4fe4-e9e0-4d48-1009-0d46cfcc47c4 > D:2165286a6678|dispatcher] Unknown rpc > "unknown-message-e10e4fe4-e9e0-4d48-1009-0d46cfcc47c4" > [20110717T03:25:22.848Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-6fe7a47f-58bf-fc35-43cb-e0ac2c601b13 > D:ea44b70baa57|dispatcher] Unknown rpc > "unknown-message-6fe7a47f-58bf-fc35-43cb-e0ac2c601b13" > [20110717T03:25:22.852Z| info|srv-xh5|941 > 
inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:fb2cd6d7e05a|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.859Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-cd0d5837-211d-0b4d-c82a-cec4b00fcd9d > D:8b6e130fc92c|dispatcher] Unknown rpc > "unknown-message-cd0d5837-211d-0b4d-c82a-cec4b00fcd9d" > [20110717T03:25:22.862Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:1f574cdac0fa|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.868Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-c854d6b7-9e2f-a053-885f-92d47200af0e > D:3a1eb9a57b73|dispatcher] Unknown rpc > "unknown-message-c854d6b7-9e2f-a053-885f-92d47200af0e" > [20110717T03:25:22.872Z| info|srv-xh5|941 > inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:50b18054e7c2|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.878Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-5828f761-f246-b296-d7ac-6fd545c9523a > D:e746b3c1afff|dispatcher] Unknown rpc > "unknown-message-5828f761-f246-b296-d7ac-6fd545c9523a" > [20110717T03:25:22.882Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:32ce62f2351b|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.887Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-96e9321f-cd26-4602-783a-6108ddf5f60e > D:5c50553f6986|dispatcher] Unknown rpc > "unknown-message-96e9321f-cd26-4602-783a-6108ddf5f60e" > [20110717T03:25:22.890Z| info|srv-xh5|941 > inet-RPC|dispatch:VDI.add_to_xenstore_data D:af32f2a61c7d|api_effect] > VDI.add_to_xenstore_data > [20110717T03:25:22.896Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-1fbf77fc-4f7c-0fe8-8bab-6514313f8d99 > D:8869860dc5c2|dispatcher] Unknown rpc > "unknown-message-1fbf77fc-4f7c-0fe8-8bab-6514313f8d99" > [20110717T03:25:22.900Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.add_to_xenstore_data D:ce0845d4e2a4|api_effect] > VDI.add_to_xenstore_data > [20110717T03:25:22.906Z|debug|srv-xh5|941 > 
inet-RPC|dispatch:unknown-message-a8848c70-f812-3fa7-8fa9-cf8e71d172d6 > D:313535d0b437|dispatcher] Unknown rpc > "unknown-message-a8848c70-f812-3fa7-8fa9-cf8e71d172d6" > [20110717T03:25:22.909Z| info|srv-xh5|941 > inet-RPC|dispatch:VDI.add_to_xenstore_data D:12ab872bca8b|api_effect] > VDI.add_to_xenstore_data > [20110717T03:25:22.916Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-c58fca96-03f3-a982-903a-17f4efaae085 > D:f82c4dcd4074|dispatcher] Unknown rpc > "unknown-message-c58fca96-03f3-a982-903a-17f4efaae085" > [20110717T03:25:22.919Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:80846e1bb1ba|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.925Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-04fe9fef-c43c-af37-2402-0fc5ef2c77ca > D:f00eb080a6b6|dispatcher] Unknown rpc > "unknown-message-04fe9fef-c43c-af37-2402-0fc5ef2c77ca" > [20110717T03:25:22.929Z| info|srv-xh5|941 > inet-RPC|dispatch:VDI.remove_from_xenstore_data > D:3ce94a530657|api_effect] VDI.remove_from_xenstore_data > [20110717T03:25:22.935Z|debug|srv-xh5|769 > inet-RPC|dispatch:unknown-message-6b9aa06d-663b-6e57-cff7-d90bbcfc7ec1 > D:903dc56876e8|dispatcher] Unknown rpc > "unknown-message-6b9aa06d-663b-6e57-cff7-d90bbcfc7ec1" > [20110717T03:25:22.938Z| info|srv-xh5|769 > inet-RPC|dispatch:VDI.add_to_xenstore_data D:4a8cc5d3f569|api_effect] > VDI.add_to_xenstore_data > [20110717T03:25:22.945Z|debug|srv-xh5|941 > inet-RPC|dispatch:unknown-message-53a68458-5014-b6e3-f7d1-3f7bd7d21c30 > D:feaf7a5008b9|dispatcher] Unknown rpc > "unknown-message-53a68458-5014-b6e3-f7d1-3f7bd7d21c30" > [20110717T03:25:22.948Z| info|srv-xh5|941 > inet-RPC|dispatch:VDI.add_to_xenstore_data D:b23fef1a8024|api_effect] > VDI.add_to_xenstore_data > [20110717T03:25:23.351Z|debug|srv-xh5|506 inet-RPC||xapi] Raised at > db_cache_types.ml:75.27-76 -> db_cache_types.ml:118.2-40 -> > pervasiveext.ml:22.2-9 > [20110717T03:25:23.811Z|debug|srv-xh5|769 > 
inet-RPC|dispatch:unknown-message-ff7e2de6-99fc-750c-583a-25da8c269a71 > D:03b6fca3f655|dispatcher] Unknown rpc > "unknown-message-ff7e2de6-99fc-750c-583a-25da8c269a71" > [20110717T03:25:23.814Z|debug|srv-xh5|769 inet-RPC||dummytaskhelper] > task dispatch:VM.atomic_set_resident_on D:0c2315ff6cf1 created by task > R:524e127c00dc > [20110717T03:25:23.817Z| info|srv-xh5|769 > inet-RPC|dispatch:VM.atomic_set_resident_on D:0c2315ff6cf1|taskhelper] > task VM.atomic_set_resident_on R:ff0239377c59 > (uuid:a69c3e66-efad-d7b3-f955-d5f45c62e6d7) created > (trackid=fe7e3d8d48fcd164b2a29b75c4a2147b) by task R:524e127c00dc > [20110717T03:25:23.817Z|debug|srv-xh5|769 > inet-RPC|VM.atomic_set_resident_on R:ff0239377c59|audit] > VM.atomic_set_resident_on: VM = 'a0e508db-fb6d-4ad9-4d68-9e6d20a0d2c5 > (1499_5393)' > [20110717T03:25:23.839Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender cleaning up by destroying remains of local > domain > [20110717T03:25:23.840Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Domain.destroy: all known devices = [ frontend > (domid=1 | kind=vbd | devid=51760); backend (domid=0 | kind=vbd | > devid=51760); frontend (domid=1 | kind=vbd | devid=51712); backend > (domid=0 | kind=vbd | devid=51712); frontend (domid=1 | kind=vif | > devid=0); backend (domid=0 | kind=vif | devid=0) ] > [20110717T03:25:23.840Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Domain.destroy calling Xc.domain_destroy (domid 1) > [20110717T03:25:24.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] No qemu-dm pid in xenstore; assuming this domain > was PV > [20110717T03:25:24.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.hard_shutdown frontend (domid=1 | > kind=vbd | devid=51760); backend (domid=0 | kind=vbd | devid=51760) > [20110717T03:25:24.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.request_shutdown frontend 
(domid=1 | > kind=vbd | devid=51760); backend (domid=0 | kind=vbd | devid=51760) force > [20110717T03:25:24.001Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-write > /local/domain/0/backend/vbd/1/51760/shutdown-request = force > [20110717T03:25:24.002Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /local/domain/0/backend/vbd/1/51760/shutdown-done ] with timeout > 1200.000000 seconds > [20110717T03:25:24.004Z|debug|srv-xh5|326 xal_listen||event] VM (domid: > 1) device_event = device shutdown {vbd,51760} > [20110717T03:25:24.005Z|debug|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = device shutdown {vbd,51760} D:c5aaca9f5d15|event] Adding > Resync.vbd to queue > [20110717T03:25:24.005Z|debug|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = device shutdown {vbd,51760} > D:c5aaca9f5d15|locking_helpers] push(per-VM queue, DevShutdownDone(vbd, > 51760) domid: 1); queue = [ DevShutdownDone(vbd, 51712) domid: 1; > DevShutdownDone(vbd, 51760) domid: 1 ](2) > [20110717T03:25:24.005Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.rm_device_state frontend (domid=1 | > kind=vbd | devid=51760); backend (domid=0 | kind=vbd | devid=51760) > [20110717T03:25:24.005Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/device/vbd/51760 > [20110717T03:25:24.006Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/0/backend/vbd/1/51760 > [20110717T03:25:24.006Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/0/error/backend/vbd/1 > [20110717T03:25:24.006Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/error/device/vbd/51760 > [20110717T03:25:24.007Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.hard_shutdown complete > 
[20110717T03:25:24.007Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.hard_shutdown frontend (domid=1 | > kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712) > [20110717T03:25:24.007Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.request_shutdown frontend (domid=1 | > kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712) force > [20110717T03:25:24.007Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-write > /local/domain/0/backend/vbd/1/51712/shutdown-request = force > [20110717T03:25:24.007Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /local/domain/0/backend/vbd/1/51712/shutdown-done ] with timeout > 1200.000000 seconds > [20110717T03:25:24.008Z|debug|srv-xh5|326 xal_listen||event] VM (domid: > 1) device_event = device shutdown {vbd,51712} > [20110717T03:25:24.009Z|debug|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = device shutdown {vbd,51712} D:1620aa41d366|event] Adding > Resync.vbd to queue > [20110717T03:25:24.009Z|debug|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = device shutdown {vbd,51712} > D:1620aa41d366|locking_helpers] push(per-VM queue, DevShutdownDone(vbd, > 51712) domid: 1); queue = [ DevShutdownDone(vbd, 51712) domid: 1; > DevShutdownDone(vbd, 51760) domid: 1; DevShutdownDone(vbd, 51712) domid: > 1 ](3) > [20110717T03:25:24.009Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.rm_device_state frontend (domid=1 | > kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712) > [20110717T03:25:24.010Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/device/vbd/51712 > [20110717T03:25:24.010Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/0/backend/vbd/1/51712 > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > 
R:524e127c00dc|xenops] xenstore-rm /local/domain/0/error/backend/vbd/1 > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/error/device/vbd/51712 > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vbd.hard_shutdown complete > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vif.hard_shutdown frontend (domid=1 | > kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0) > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-write > /local/domain/0/backend/vif/1/0/online = 0 > [20110717T03:25:24.011Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vif.hard_shutdown about to blow away > frontend > [20110717T03:25:24.012Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/device/vif/0 > [20110717T03:25:24.012Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /xapi/1/hotplug/vif/0/hotplug ] with timeout 1200.000000 seconds > [20110717T03:25:24.686Z|debug|srv-xh5|326 xal_listen||event] VM (domid: > 1) device_event = HotplugChanged on 0 {online->""} > [20110717T03:25:24.687Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.Vif.hard_shutdown about to blow away > backend and error paths > [20110717T03:25:24.687Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Device.rm_device_state frontend (domid=1 | > kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0) > [20110717T03:25:24.687Z|error|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = HotplugChanged on 0 {online->""} D:ec045c4a68c9|event] > device_event could not be processed because VM record not in database > [20110717T03:25:24.687Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm 
/local/domain/1/device/vif/0 > [20110717T03:25:24.687Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/0/backend/vif/1/0 > [20110717T03:25:24.688Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/0/error/backend/vif/1 > [20110717T03:25:24.688Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] xenstore-rm /local/domain/1/error/device/vif/0 > [20110717T03:25:24.688Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.release: frontend (domid=1 | kind=vbd | > devid=51760); backend (domid=0 | kind=vbd | devid=51760) > [20110717T03:25:24.688Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.wait_for_unplug: frontend (domid=1 | > kind=vbd | devid=51760); backend (domid=0 | kind=vbd | devid=51760) > [20110717T03:25:24.688Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /xapi/1/hotplug/vbd/51760/hotplug ] with timeout 1200.000000 seconds > [20110717T03:25:24.689Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Synchronised ok with hotplug script: frontend > (domid=1 | kind=vbd | devid=51760); backend (domid=0 | kind=vbd | > devid=51760) > [20110717T03:25:24.689Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.release releasing /dev/loop7 > [20110717T03:25:24.696Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.release: frontend (domid=1 | kind=vbd | > devid=51712); backend (domid=0 | kind=vbd | devid=51712) > [20110717T03:25:24.696Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.wait_for_unplug: frontend (domid=1 | > kind=vbd | devid=51712); backend (domid=0 | kind=vbd | devid=51712) > [20110717T03:25:24.696Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /xapi/1/hotplug/vbd/51712/hotplug ] with 
timeout 1200.000000 seconds > [20110717T03:25:24.697Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Synchronised ok with hotplug script: frontend > (domid=1 | kind=vbd | devid=51712); backend (domid=0 | kind=vbd | > devid=51712) > [20110717T03:25:24.697Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.release: frontend (domid=1 | kind=vif | > devid=0); backend (domid=0 | kind=vif | devid=0) > [20110717T03:25:24.697Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Hotplug.wait_for_unplug: frontend (domid=1 | > kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0) > [20110717T03:25:24.697Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] watch: watching xenstore paths: [ > /xapi/1/hotplug/vif/0/hotplug ] with timeout 1200.000000 seconds > [20110717T03:25:24.697Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Synchronised ok with hotplug script: frontend > (domid=1 | kind=vif | devid=0); backend (domid=0 | kind=vif | devid=0) > [20110717T03:25:24.697Z| warn|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|hotplug] Warning, deleting 'vif' entry from > /xapi/1/hotplug/vif/0 > [20110717T03:25:24.698Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Domain.destroy: rm /local/domain/1 > [20110717T03:25:24.698Z|debug|srv-xh5|326 xal_listen||event] VM (domid: > 1) device_event = ChangeUncooperative false > [20110717T03:25:24.699Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Domain.destroy: deleting backend paths > [20110717T03:25:24.702Z|error|srv-xh5|326 xal_listen|VM (domid: 1) > device_event = ChangeUncooperative false D:bd8f207fa01d|event] > device_event could not be processed because VM record not in database > [20110717T03:25:24.702Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Xc.domain_getinfo 1 threw: getinfo failed: domain > 1: getinfo failed: domain 1: getinfo failed: domain 1: 
hypercall 36 > fail: 11: Resource temporarily unavailable (ret -1) -- assuming domain > nolonger exists > [20110717T03:25:24.702Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xenops] Xc.domain_getinfo 1 threw: getinfo failed: domain > 1: getinfo failed: domain 1: getinfo failed: domain 1: getinfo failed: > domain 1: hypercall 36 fail: 11: Resource temporarily unavailable (ret > -1) -- assuming domain nolonger exists > [20110717T03:25:24.702Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|storage_access] vdi refcount violation (on detach): > hashtbl reports '0' for VDI '2c6f0960-2742-48ab-9935-4ddb2d5fcb28' > [20110717T03:25:24.702Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|sm] SM lvmoiscsi vdi_detach > sr=OpaqueRef:a1077822-9453-de43-0efa-ecb0fe508f59 > vdi=OpaqueRef:c34b2134-fe3e-0780-a5bf-12e4cd3edd4e > [20110717T03:25:24.703Z|debug|srv-xh5|861 inet-RPC|sm_exec > D:dae4922e7f32|xapi] Raised at db_cache_types.ml:75.27-76 -> > db_cache_types.ml:118.2-40 -> pervasiveext.ml:22.2-9 > [20110717T03:25:24.706Z| info|srv-xh5|861 inet-RPC|sm_exec > D:68e0f95eaa4c|xapi] Session.create > trackid=d877d75ed8b11fa0707952ac666775e3 pool=false uname= > is_local_superuser=true auth_user_sid= > parent=trackid=9834f5af41c964e225f24279aefe4e49 > [20110717T03:25:24.708Z|debug|srv-xh5|944 unix-RPC||dummytaskhelper] > task dispatch:session.get_uuid D:800b77fd9b6b created by task > D:68e0f95eaa4c > [20110717T03:25:24.786Z|debug|srv-xh5|945 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:d5d2dd28b15e created by task > R:524e127c00dc > [20110717T03:25:24.791Z|debug|srv-xh5|946 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:69c9b6712ebe created by task > R:524e127c00dc > [20110717T03:25:24.796Z|debug|srv-xh5|947 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:c64123009034 created by task > R:524e127c00dc > [20110717T03:25:24.804Z|debug|srv-xh5|948 unix-RPC||dummytaskhelper] > task 
dispatch:SR.get_sm_config D:774ef164c97c created by task > R:524e127c00dc > [20110717T03:25:24.809Z|debug|srv-xh5|949 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:96b14ab38fff created by task > R:524e127c00dc > [20110717T03:25:24.850Z|debug|srv-xh5|950 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:6bba887bd3c5 created by task > R:524e127c00dc > [20110717T03:25:24.859Z|debug|srv-xh5|951 unix-RPC||dummytaskhelper] > task dispatch:SR.get_other_config D:1cefb90e3af2 created by task > R:524e127c00dc > [20110717T03:25:24.863Z|debug|srv-xh5|952 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:330984d04c06 created by task > R:524e127c00dc > [20110717T03:25:25.019Z|debug|srv-xh5|953 unix-RPC||dummytaskhelper] > task dispatch:SR.get_other_config D:9782795ccb43 created by task > R:524e127c00dc > [20110717T03:25:25.024Z|debug|srv-xh5|954 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:8f7cd02d4617 created by task > R:524e127c00dc > [20110717T03:25:25.104Z|debug|srv-xh5|955 unix-RPC||dummytaskhelper] > task dispatch:VDI.get_sm_config D:18960ea23f60 created by task > R:524e127c00dc > [20110717T03:25:25.116Z| info|srv-xh5|861 inet-RPC|sm_exec > D:68e0f95eaa4c|xapi] Session.destroy > trackid=d877d75ed8b11fa0707952ac666775e3 > [20110717T03:25:25.116Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|storage_access] Executed detach succesfully on VDI > '2c6f0960-2742-48ab-9935-4ddb2d5fcb28'; attach refcount now: 0 > [20110717T03:25:25.117Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|sm] SM iso vdi_detach > sr=OpaqueRef:47c6426c-0ef1-dfa5-8730-1fde4210814a > vdi=OpaqueRef:20abc7c0-7fb7-4f7d-1bd7-8f08e6331cf8 > [20110717T03:25:25.118Z|debug|srv-xh5|861 inet-RPC|sm_exec > D:df56ae240a2d|xapi] Raised at db_cache_types.ml:75.27-76 -> > db_cache_types.ml:118.2-40 -> pervasiveext.ml:22.2-9 > [20110717T03:25:25.120Z| info|srv-xh5|861 inet-RPC|sm_exec > D:17f564e2566b|xapi] Session.create > 
trackid=479aa14f2dfb536c21505cb28371af06 pool=false uname= > is_local_superuser=true auth_user_sid= > parent=trackid=9834f5af41c964e225f24279aefe4e49 > [20110717T03:25:25.122Z|debug|srv-xh5|956 unix-RPC||dummytaskhelper] > task dispatch:session.get_uuid D:f69023151d55 created by task > D:17f564e2566b > [20110717T03:25:25.198Z|debug|srv-xh5|957 unix-RPC||dummytaskhelper] > task dispatch:host.get_other_config D:a42dc9aa7e0a created by task > R:524e127c00dc > [20110717T03:25:25.203Z|debug|srv-xh5|958 unix-RPC||dummytaskhelper] > task dispatch:PBD.get_all_records D:b34b75f199d2 created by task > R:524e127c00dc > [20110717T03:25:25.212Z|debug|srv-xh5|959 unix-RPC||dummytaskhelper] > task dispatch:SR.get_sm_config D:26988c9740f9 created by task > R:524e127c00dc > [20110717T03:25:25.222Z| info|srv-xh5|861 inet-RPC|sm_exec > D:17f564e2566b|xapi] Session.destroy > trackid=479aa14f2dfb536c21505cb28371af06 > [20110717T03:25:25.222Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|storage_access] Executed detach succesfully on VDI > '017fb470-d643-4f44-abeb-8b8749aeb114'; attach refcount now: 0 > [20110717T03:25:25.222Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender 8.Logging out of remote server > [20110717T03:25:25.224Z|debug|srv-xh5|960 unix-RPC||dummytaskhelper] > task dispatch:session.logout D:78f8ff3daa03 created by task > R:524e127c00dc > [20110717T03:25:25.226Z| info|srv-xh5|960 unix-RPC|session.logout > D:8509467bc136|xapi] Session.destroy > trackid=fe7e3d8d48fcd164b2a29b75c4a2147b > [20110717T03:25:25.226Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Sender 9. 
Closing memory image transfer socket > [20110717T03:25:25.226Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|locking_helpers] pop(per-VM queue) = DevShutdownDone(vbd, > 51712) domid: 1 > [20110717T03:25:25.227Z|debug|srv-xh5|861 inet-RPC|VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing > DevShutdownDone(vbd, 51712) domid: 1 D:ffac7ef4b754|event] VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf (1499_5393) resident_on > other host OpaqueRef:4f82279c-1222-368c-4e8f-8667e88485d9 (srv-xh5): > taking no action > [20110717T03:25:25.227Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|locking_helpers] pop(per-VM queue) = DevShutdownDone(vbd, > 51760) domid: 1 > [20110717T03:25:25.228Z|debug|srv-xh5|861 inet-RPC|VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing > DevShutdownDone(vbd, 51760) domid: 1 D:0a90204d1a02|event] VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf (1499_5393) resident_on > other host OpaqueRef:4f82279c-1222-368c-4e8f-8667e88485d9 (srv-xh5): > taking no action > [20110717T03:25:25.228Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|locking_helpers] pop(per-VM queue) = DevShutdownDone(vbd, > 51712) domid: 1 > [20110717T03:25:25.229Z|debug|srv-xh5|861 inet-RPC|VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf: processing > DevShutdownDone(vbd, 51712) domid: 1 D:4c11e5169e18|event] VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf (1499_5393) resident_on > other host OpaqueRef:4f82279c-1222-368c-4e8f-8667e88485d9 (srv-xh5): > taking no action > [20110717T03:25:25.229Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|locking_helpers] Released lock on VM > OpaqueRef:dd860476-057f-2b22-6120-7668828720cf with token 6 > [20110717T03:25:25.231Z|debug|srv-xh5|872||thread_queue] > long_running_op: completed processing 1 items: queue = [ ](0) > [20110717T03:25:25.234Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate > R:524e127c00dc|xapi] Checking for vdis_reset_and_caching... 
> [20110717T03:25:25.234Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Op allowed!
> [20110717T03:25:25.235Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Checking for vdis_reset_and_caching...
> [20110717T03:25:25.235Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Op allowed!
> [20110717T03:25:25.237Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Checking for vdis_reset_and_caching...
> [20110717T03:25:25.237Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Op allowed!
> [20110717T03:25:25.243Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|taskhelper] the status of R:524e127c00dc is: success; cannot set it to `success
> [20110717T03:25:25.252Z|debug|srv-xh5|861 inet-RPC|dispatch:unknown-message-f5de3dd8-d90b-b00e-bb70-9fca4f551a89 D:f1ebe51d335b|dispatcher] Unknown rpc "unknown-message-f5de3dd8-d90b-b00e-bb70-9fca4f551a89"
> [20110717T03:25:25.255Z|debug|srv-xh5|941 inet-RPC|dispatch:unknown-message-86550759-61bd-e61b-6c44-dd4f61433fda D:ad521ab1c607|dispatcher] Unknown rpc "unknown-message-86550759-61bd-e61b-6c44-dd4f61433fda"
> [20110717T03:25:25.260Z| info|srv-xh5|941 inet-RPC|session.logout D:e23d456856c8|xapi] Session.destroy trackid=c82b070d193b45e7baf3630b3cd6d2ff
> [20110717T03:26:10.878Z|debug|srv-xh5|769 inet-RPC|dispatch:unknown-message-d19d13ac-931d-f95b-590f-b3e0d546e603 D:db9820d7a192|dispatcher] Unknown rpc "unknown-message-d19d13ac-931d-f95b-590f-b3e0d546e603"
>
> _______________________________________________
> xen-api mailing list
> xen-api@xxxxxxxxxxxxxxxxxxx
> http://lists.xensource.com/mailman/listinfo/xen-api
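[Editorial note: the excerpt above interleaves several xapi threads. Every entry belonging to one operation carries the same task reference (this migration is R:524e127c00dc), so filtering the log on that reference reconstructs the operation end to end. A minimal shell sketch, using a temporary sample file in place of /var/log/xensource.log on a real XCP host:]

```shell
#!/bin/sh
# Sketch: extract every xensource.log entry belonging to one xapi task
# by its task reference. TASK_REF is the migration task quoted above;
# the sample file below stands in for /var/log/xensource.log.
TASK_REF="R:524e127c00dc"

LOG=$(mktemp)
cat > "$LOG" <<'EOF'
[20110717T03:25:25.234Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Checking for vdis_reset_and_caching...
[20110717T03:25:25.234Z|debug|srv-xh5|861 inet-RPC|VM.pool_migrate R:524e127c00dc|xapi] Op allowed!
[20110717T03:25:25.260Z| info|srv-xh5|941 inet-RPC|session.logout D:e23d456856c8|xapi] Session.destroy trackid=c82b070d193b45e7baf3630b3cd6d2ff
EOF

# On a real host this would simply be:
#   grep "$TASK_REF" /var/log/xensource.log
MATCHES=$(grep -c "$TASK_REF" "$LOG")   # count of entries for this task
grep "$TASK_REF" "$LOG"                 # print only this task's entries
rm -f "$LOG"
```

Subtask references (the D:xxxx identifiers) created by the task show up as "created by task R:..." lines, so a second grep on those IDs widens the trace to helper tasks.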