Using the internal MacBook camera results in a streaming error #191
Comments
I also tried "pipeline:avfvideosrc" and "pipeline:avfvideosrc device-index=0" as the camera name and still got the streaming error.

Same problem on macOS Monterey 12.0.1 with the same laptop and Processing version.

I'm having exactly the same problem: macOS Monterey, MacBook Air (Retina, 13-inch, 2019).

Apparently there are more problems with Monterey and GStreamer: https://discourse.processing.org/t/processing-new-gstreamer-error/33975/8

The last comment on that post led me to a workaround which does help: start OBS Studio, start its virtual camera, and then run the sketch, and it works fine. There seems to be a mismatch between how the built-in FaceTime camera should be started and what GStreamer is trying to do, and putting OBS in between fixes that. Apparently this problem is not unique to GStreamer or Processing.
By using the OBS virtual camera, Capture.list()[0] is the OBS virtual cam for me. MacBook Pro (13-inch, 2018, Four Thunderbolt 3 Ports).

Still get the same error, even with Capture.list()[0]. Printing out the list shows "FaceTime HD Camera (Built-in)", but I get the same streaming error.
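A minimal sketch (not from this thread) for selecting the built-in camera by name rather than by index, since OBS or other virtual cameras can change the ordering of Capture.list(); the "FaceTime" substring match is an assumption about how the built-in camera is named:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  String[] cameras = Capture.list();
  printArray(cameras);
  // Default to the first entry, but prefer the built-in FaceTime camera if present.
  String name = cameras[0];
  for (String c : cameras) {
    if (c.contains("FaceTime")) {
      name = c;
      break;
    }
  }
  cam = new Capture(this, 640, 480, name);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
}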
Did you try installing OBS and configuring the Virtual Camera as @twisst recommended?

I solved it by writing:

I confirm this error with external cameras also.

Fixed with ace6d59
There may be an additional problem, as I still get the same error with Processing 4.1.3, even when using the full method signature for a second USB camera. This code works for "FaceTime HD Camera (Built-in)", but not with a second USB camera, and I tried it with two different USB cameras. The lines printed by printArray come from:

cameras = Capture.list();

The error message is still the same:

BaseSrc: [avfvideosrc0] : Internal data stream error.

UPDATE: It now works with this call to the Capture constructor:

video = new Capture(this, width/scl, height/scl, cameras[0]);

But it does not work with this:

video = new Capture(this, width/scl, height/scl, cameras[0], 30); // nor does it work with , FPS)

One more variable I introduced before it started working was that I installed the processing.video library for 3.5.3. But this example is now working in 4.1.3.
I tried your code with Processing 4.2 and video library 2.2.2 and it worked without any problem.

For me the example works in 4.1.3 / video library 2.2.2 with both an external and internal camera. There is an odd thing on macOS where I need to provide the FPS for the internal camera (30).

Well, as I've said, I've got it working now with the external USB camera on macOS. But given your comment quoted below, I now need to go back and see if it will work with the internal camera without the FPS parameter.

Hi :-) I'm trying to help someone revive an old project I did which uses a USB webcam. She uses an
In the software I wrote I specified 30 FPS, but it doesn't help. Any ideas what I could try?
Here's a test program:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  String[] cameras = Capture.list();
  println("Available cameras:");
  printArray(cameras);
  cam = new Capture(this, 640, 480, cameras[0]);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
}
I don't use macOS, but when I encounter a problem like this on Windows, I use FFmpeg to show me all the USB/UVC cameras with the resolutions and fps they support. I pick one and use Capture with a pipeline to set the width, height, and fps for the particular camera I want:

video = new Capture(this, cameraWidth, cameraHeight, pipeline);

ksvideosrc is a Windows-only source. There may be a version of FFmpeg for macOS: http://ffmpeg.org/download.html#build-mac
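A minimal sketch of that approach on macOS (not the commenter's code): FFmpeg's avfoundation input should list the attached cameras, and the chosen device index can then go into a GStreamer avfvideosrc pipeline string for Capture. The width/height/framerate caps after the source element are an assumption about what the library accepts; the bare "pipeline:avfvideosrc device-index=0" form is the one confirmed later in this thread.

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // List the cameras first, e.g. from a terminal:
  //   ffmpeg -f avfoundation -list_devices true -i ""
  // then point avfvideosrc at the device index you want.
  // The caps after the "!" are an assumption; if the constructor rejects the
  // string, drop them and use the bare avfvideosrc form instead.
  String pipeline = "pipeline:avfvideosrc device-index=0 ! video/x-raw,width=640,height=480,framerate=30/1";
  cam = new Capture(this, 640, 480, pipeline);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
}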
Try adding the framerate at the end. It helped me to handle this error.

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  String[] cameras = Capture.list();
  println("Available cameras:");
  printArray(cameras);
  cam = new Capture(this, 640, 480, cameras[0], 30);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
}
On Thu, 3 Aug 2023 at 06:24, Abe Pazos wrote:
… Just adding here the OS info / specs of the problematic computer:
[image: https://user-images.githubusercontent.com/108264/258066254-ded72068-e95e-4321-897d-50d802e0d815.png]
The problems on two different Mac computers were solved by updating macOS. Before that, both failed to run any webcam examples.

Same thing here, on Processing 4.3 on a MacBook Pro 2020, Sonoma 14.4.1. Previously, adding the FPS fixed this issue; now it's the reverse :-D Thanks for the tip.

Sonoma 14.4: was fixed with the proposed solution here: #208

Change cameras[0] to "pipeline:avfvideosrc device-index=0" and it will work. I found the solution on this forum: https://lab.arts.ac.uk/books/creative-coding/page/how-to-fix-internal-data-stream-error-in-processing-4
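Applied to the test sketch earlier in this thread, that change is just the constructor argument; a minimal sketch, assuming the built-in camera is device index 0:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // Hand GStreamer an avfvideosrc pipeline directly instead of a device name.
  // device-index=0 assumes the built-in FaceTime camera is the first device.
  cam = new Capture(this, 640, 480, "pipeline:avfvideosrc device-index=0");
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
}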
Description
Using the internal camera on a MacBook results in a streaming error
Expected Behavior
Be able to use the internal MacBook camera as a Capture device
Current Behavior
Steps to Reproduce
Your Environment
Possible Causes / Solutions