A late answer, but I think this is a good question and it doesn't have a good answer yet.
If you want to stream the camera and microphone from an Android device, you have two main options: a Java or an NDK implementation.
- Java implementation.
I'm only going to mention the idea, but basically it is to implement an RTSP server and the RTP protocol in Java based on these standards: Real-Time Streaming Protocol Version 2.0 and RTP Payload Format for H.264 Video. This task will be very long and hard, but if you are doing your PhD it could be nice to have a solid RTSP Java lib for Android.
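To give a taste of what that route involves, here is a minimal sketch (not from any existing library; the class and method names are illustrative) of packing the 12-byte fixed RTP header defined in RFC 3550 in front of a payload. A real server would also need RTSP session handling (DESCRIBE/SETUP/PLAY), H.264 payloadization per RFC 6184, RTCP, and so on.
import java.nio.ByteBuffer;
// Illustrative only: packs the 12-byte fixed RTP header (RFC 3550) in front of a payload.
public final class RtpPacketizer {
    private static final int RTP_VERSION = 2;
    public static byte[] buildPacket(int payloadType, int sequenceNumber, long timestamp,
                                     long ssrc, boolean marker, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) (RTP_VERSION << 6));                               // V=2, P=0, X=0, CC=0
        buf.put((byte) ((marker ? 0x80 : 0x00) | (payloadType & 0x7F)));  // M bit + payload type
        buf.putShort((short) sequenceNumber);                             // sequence number
        buf.putInt((int) timestamp);                                      // RTP timestamp (90 kHz clock for H.264)
        buf.putInt((int) ssrc);                                           // synchronization source id
        buf.put(payload);                                                 // e.g. one H.264 NAL unit (RFC 6184)
        return buf.array();
    }
}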
- NDK implementation.
This alternative includes various solutions. The main idea is to use a powerful C or C++ library in our Android application, for instance FFmpeg. This library can be compiled for Android and supports various architectures.
The problem with this approach is that you may need to learn about the Android NDK, C and C++ to accomplish this.
But there is an alternative: you can wrap the C library and use FFmpeg from Java. But how?
For example, there is FFmpeg Android, compiled with x264, libass, fontconfig, freetype and fribidi, and it supports various architectures. However, if you want to stream in real time you need to deal with file descriptors and in/out streams yourself.
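Purely as an illustration of what that means (this is not the author's code; the binary path, resolution and destination URL are placeholders you would have to adapt), a Java sketch that spawns a bundled ffmpeg binary and feeds it raw NV21 preview frames through its stdin could look like this:
import java.io.IOException;
import java.io.OutputStream;
// Rough sketch: pipe raw camera frames into a bundled ffmpeg binary that streams them over UDP.
// Paths, resolution and URL are placeholders; error handling and audio are omitted.
public class FfmpegPipeStreamer {
    private Process ffmpeg;
    private OutputStream videoIn;
    public void start(String ffmpegBinaryPath) throws IOException {
        ffmpeg = new ProcessBuilder(
                ffmpegBinaryPath,
                "-f", "rawvideo", "-pix_fmt", "nv21", "-s", "640x480", "-r", "30",
                "-i", "pipe:0",                       // read raw frames from stdin
                "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
                "-f", "mpegts", "udp://ip_destination:8080")
                .redirectErrorStream(true)
                .start();
        videoIn = ffmpeg.getOutputStream();           // this is ffmpeg's stdin
    }
    // Call this from the camera preview callback with the raw NV21 buffer.
    public void pushFrame(byte[] nv21) throws IOException {
        videoIn.write(nv21);
    }
    public void stop() throws IOException {
        videoIn.close();                              // EOF lets ffmpeg finish the stream
        ffmpeg.destroy();
    }
}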
The best alternative, from a Java programming point of view, is to use JavaCV. JavaCV uses wrappers from commonly used libraries of computer vision and multimedia (OpenCV, FFmpeg, etc.) and provides utility classes to make their functionality easier to use on the Java platform, including (of course) Android.
JavaCV also comes with hardware accelerated full-screen image display (CanvasFrame and GLCanvasFrame), easy-to-use methods to execute code in parallel on multiple cores (Parallel), user-friendly geometric and color calibration of cameras and projectors (GeometricCalibrator, ProCamGeometricCalibrator, ProCamColorCalibrator), detection and matching of feature points (ObjectFinder), a set of classes that implement direct image alignment of projector-camera systems (mainly GNImageAligner, ProjectiveTransformer, ProjectiveColorTransformer, ProCamTransformer, and ReflectanceInitializer), a blob analysis package (Blobs), as well as miscellaneous functionality in the JavaCV class. Some of these classes also have an OpenCL and OpenGL counterpart, their names ending with CL or starting with GL, i.e.: JavaCVCL, GLCanvasFrame, etc.
But how can we use this solution?
Here we have a basic implementation of the streaming using UDP.
String streamURL = "udp://ip_destination:port";
recorder = new FFmpegFrameRecorder(streamURL, frameWidth, frameHeight, 1);
recorder.setInterleaved(false);
// video options //
recorder.setFormat("mpegts");
recorder.setVideoOption("tune", "zerolatency");
recorder.setVideoOption("preset", "ultrafast");
recorder.setVideoBitrate(5 * 1024 * 1024);
recorder.setFrameRate(30);
recorder.setSampleRate(AUDIO_SAMPLE_RATE);
recorder.setVideoCodec(AV_CODEC_ID_H264);
recorder.setAudioCodec(AV_CODEC_ID_AAC);
This portion of code shows how to initialize the FFmpegFrameRecorder object called recorder. This object will capture and encode the frames obtained from the camera and the samples obtained from the microphone.
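For context, the snippet above is a fragment; it assumes the JavaCV imports below and that the recorder is started before any frame is recorded. The helper methods are only a sketch (package names are for JavaCV 1.5+; on older releases the codec constants live in org.bytedeco.javacpp.avcodec instead):
import org.bytedeco.javacv.FFmpegFrameRecorder;
import static org.bytedeco.ffmpeg.global.avcodec.AV_CODEC_ID_AAC;
import static org.bytedeco.ffmpeg.global.avcodec.AV_CODEC_ID_H264;
private void startRecording() throws FFmpegFrameRecorder.Exception {
    recorder.start();        // opens the udp:// output; must be called before record()
    recording = true;
    startTime = System.currentTimeMillis();
}
private void stopRecording() throws FFmpegFrameRecorder.Exception {
    recording = false;
    recorder.stop();         // flushes the encoder and closes the muxer
    recorder.release();
}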
If you want to show a preview in the same Android app, we need to implement a CameraPreview class; this class will convert the raw data served from the camera and will create the preview and the Frame for the FFmpegFrameRecorder.
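That class is not included in this answer, but as a rough sketch (assuming the legacy android.hardware.Camera preview callback and JavaCV's AndroidFrameConverter; the field names are illustrative), the conversion step could look like this:
import android.hardware.Camera;
import org.bytedeco.javacv.AndroidFrameConverter;
import org.bytedeco.javacv.Frame;
// Sketch: turn the raw NV21 preview buffer into a JavaCV Frame for the recorder.
private final AndroidFrameConverter converterToFrame = new AndroidFrameConverter();
private final Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        // data arrives in NV21, the default preview format of android.hardware.Camera
        Frame frame = converterToFrame.convert(data, size.width, size.height);
        try {
            recorder.record(frame);   // timestamp handling omitted; see onCameraFrame below
        } catch (FFmpegFrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }
};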
Replace the ip_destination with the IP of the PC or device where you want to send the stream. The port can be 8080, for example.
@Override
public Mat onCameraFrame(Mat mat)
{
if (audioRecordRunnable == null) {
startTime = System.currentTimeMillis();
return mat;
}
if (recording && mat != null) {
synchronized (semaphore) {
try {
Frame frame = converterToMat.convert(mat);
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(frame);
} catch (FFmpegFrameRecorder.Exception e) {
LogHelper.i(TAG, e.getMessage());
e.printStackTrace();
}
}
}
return mat;
}
This shows the implementation of the onCameraFrame method, which gets the Mat (picture) from the camera; it is converted to a Frame and recorded by the FFmpegFrameRecorder object.
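For completeness, converterToMat is not defined in the answer; it is presumably a JavaCV frame converter created once, for example (use the ToOrgOpenCvCoreMat variant instead if the Mat comes from the official OpenCV Java bindings):
OpenCVFrameConverter.ToMat converterToMat = new OpenCVFrameConverter.ToMat();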
@Override
public void onSampleReady(ShortBuffer audioData)
{
if (recorder == null) return;
if (recording && audioData == null) return;
try {
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
LogHelper.e(TAG, "audioData: " + audioData);
recorder.recordSamples(audioData);
} catch (FFmpegFrameRecorder.Exception e) {
LogHelper.v(TAG, e.getMessage());
e.printStackTrace();
}
}
This shows the implementation of the onSampleReady method, which gets the audioData, a ShortBuffer object, and records it with the FFmpegFrameRecorder.
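The answer does not show where onSampleReady is called from. A minimal sketch of such an audio capture Runnable, assuming the standard android.media.AudioRecord API, that it runs as an inner class with access to the recording flag and onSampleReady, and that AUDIO_SAMPLE_RATE matches the value passed to the recorder, could be:
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.nio.ShortBuffer;
// Illustrative sketch of the audio side: read PCM samples from the mic (RECORD_AUDIO
// permission required) and hand them to onSampleReady(ShortBuffer) while recording.
class AudioRecordRunnable implements Runnable {
    @Override
    public void run() {
        int bufferSize = AudioRecord.getMinBufferSize(AUDIO_SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                AUDIO_SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        short[] samples = new short[bufferSize / 2];
        audioRecord.startRecording();
        while (recording) {
            int read = audioRecord.read(samples, 0, samples.length);
            if (read > 0) {
                onSampleReady(ShortBuffer.wrap(samples, 0, read));
            }
        }
        audioRecord.stop();
        audioRecord.release();
    }
}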
On the destination PC or device you can run the following command to get the stream.
ffplay udp://ip_source:port
The ip_source is the IP of the smartphone that is streaming the camera and mic stream. The port must be the same 8080.
I created a solution in my github repository here: UDPAVStreamer.
Good luck