IDS uEye cameras

I’m trying to use two of these at once

Options I’ve tried -

ImagePack - great control of parameters, but CPU usage is so heavy I can't really run 2 at once.
uEyeCam - simple node with much lower CPU usage, but no option to select a camera index (so I can only use one cam). Also not much parameter control.
Video In (DX11) - uses the DirectShow driver. Poor camera control (no exposure!). Temperamental, and parameters can't be adjusted from vvvv.

What do you do?..

The ImagePack approach seems the best, but wtf is that CPU usage about??

ok, you can use two uEyeCam devices at once, but still, controls are very limited

Can you show a screenshot of your high CPU usage?

Here ya go. Once with the uEye Cockpit test app, once with the uEyeCam plugin, and once with the ImagePack.

Beta 35.8 x86 and latest ImagePack, DX11

Are you sure those temps are fine? If those are °C rather than °F, they seem pretty near the limit. Maybe the thermal paste is worn out.

Perhaps, but that's surely a separate issue. The ImagePack should at least be in the same performance ballpark as the other approaches.

Yeah, I know it’s OT, but a nicely cooled CPU could have positive general effects on the system.

hey @mrboni
tbh I have no idea why the thing consumes that much CPU power. It wouldn't surprise me if it's much slower than the Cockpit because it does much more (double-buffering the image, copying the image between at least 2 threads, and maybe converting it to a DX texture or something), but it shouldn't be that slow - at least I haven't experienced such a thing yet.
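
For illustration, here is roughly what that kind of pipeline could look like as a minimal double-buffer sketch (the class and names are made up for this post, not the actual ImagePack code); it shows where the extra per-frame copies come from:

```cpp
#include <cstdint>
#include <mutex>
#include <utility>
#include <vector>

// Hypothetical double buffer (made-up names, not the actual ImagePack code):
// the capture thread writes into the back image, the consumer reads the front
// one, and the two are swapped under a lock. Note the two full copies per frame.
class DoubleBuffer {
public:
    void write(const std::vector<uint8_t>& pixels) {
        back_ = pixels;                       // copy #1: camera -> back buffer
        std::lock_guard<std::mutex> lock(mutex_);
        std::swap(front_, back_);             // cheap pointer-level swap
        fresh_ = true;
    }

    bool read(std::vector<uint8_t>& out) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (!fresh_) return false;
        out = front_;                         // copy #2: front buffer -> consumer
        fresh_ = false;
        return true;
    }

private:
    std::mutex mutex_;
    std::vector<uint8_t> front_, back_;
    bool fresh_ = false;
};
```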

It sounds like it's probably my fault. The threading implementation in ImagePack is quite 'heavy'. It would be better with condition variables / thread channels (modern inter-thread comms rather than mutex locks and waits).
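
Roughly what I mean, as a minimal sketch (illustrative only, not the actual ImagePack code): a single-slot channel where the consumer blocks on a condition variable instead of repeatedly locking, checking and sleeping:

```cpp
#include <condition_variable>
#include <mutex>
#include <optional>
#include <utility>

// Hypothetical single-slot channel: the producer overwrites the slot with the
// newest frame; the consumer sleeps on a condition variable until one arrives.
template <typename Frame>
class FrameChannel {
public:
    void push(Frame frame) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            slot_ = std::move(frame);         // keep only the latest frame
        }
        ready_.notify_one();                  // wake the consumer, no polling
    }

    Frame pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        ready_.wait(lock, [this] { return slot_.has_value(); });
        Frame frame = std::move(*slot_);
        slot_.reset();
        return frame;
    }

private:
    std::mutex mutex_;
    std::condition_variable ready_;
    std::optional<Frame> slot_;
};
```

The point is that the consumer thread only wakes when a frame is actually there, so no CPU is burnt spinning on a lock or sleeping in a wait loop.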

Also, each node should /really/ have some options for how it communicates with its upstream node, e.g. select between (see the sketch after this list):

  • Perform in upstream thread (least overhead and latency, but can slow the overall frame rate; best for nodes which don't take a lot of time)
  • Perform in own thread (what everything does now, which is great for throughput, and especially good for heavy processes)
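
A rough sketch of what that option could look like (ThreadingMode, ImageNode and onFrame are made-up names for illustration, not existing ImagePack types):

```cpp
#include <condition_variable>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>

enum class ThreadingMode {
    Upstream,   // run directly in the caller's (upstream) thread
    OwnThread   // run on this node's own worker thread
};

// Hypothetical node skeleton: upstream calls onFrame(); depending on the mode
// the work happens inline or is queued for the node's worker thread.
class ImageNode {
public:
    explicit ImageNode(ThreadingMode mode) : mode_(mode) {
        if (mode_ == ThreadingMode::OwnThread)
            worker_ = std::thread([this] { workerLoop(); });
    }

    ~ImageNode() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            running_ = false;
        }
        wake_.notify_one();
        if (worker_.joinable()) worker_.join();
    }

    void onFrame(std::function<void()> work) {
        if (mode_ == ThreadingMode::Upstream) {
            work();                           // no hand-off, but blocks upstream
            return;
        }
        {
            std::lock_guard<std::mutex> lock(mutex_);
            pending_.push_back(std::move(work));
        }
        wake_.notify_one();
    }

private:
    void workerLoop() {
        std::unique_lock<std::mutex> lock(mutex_);
        while (running_) {
            wake_.wait(lock, [this] { return !pending_.empty() || !running_; });
            while (!pending_.empty()) {
                auto work = std::move(pending_.front());
                pending_.pop_front();
                lock.unlock();
                work();                       // heavy processing off the upstream thread
                lock.lock();
            }
        }
    }

    ThreadingMode mode_;
    std::mutex mutex_;
    std::condition_variable wake_;
    std::deque<std::function<void()>> pending_;
    bool running_ = true;
    std::thread worker_;
};
```

Cheap nodes would pick Upstream and skip the hand-off entirely; heavy nodes would keep their own thread, which is what everything does today.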

Furthermore, it would be great to output profile info (time taken in the node's thread) à la DX11 'TimeStamp'.
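
A minimal sketch of such per-node timing (again just an illustration, not an existing pin):

```cpp
#include <chrono>

// Hypothetical per-node timer: wrap the node's processing and return the
// elapsed time in milliseconds, so it could be exposed on an output pin
// much like the DX11 'TimeStamp' node.
template <typename Work>
double timeNodeWork(Work work) {
    const auto start = std::chrono::steady_clock::now();
    work();                                   // the node's actual processing
    const auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(stop - start).count();
}
```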

That said, I could happily deal with 4x 1080p 30fps HD streams on my older i7 computer with ImagePack (video player into texture), so I'm not sure why this would be so much worse than that.

Thanks for looking at this.

The GPU in my laptop is awful. Could that be causing issues?

Is there another tool you know of that can access the camera feeds and send the textures to vvvv using Spout? (Maybe a tool that can grab the video from the IDS preview app windows and spoutify it?)
