
0 votes
449 views
in Technique by (71.8m points)

streaming video FROM an iPhone

I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue; it's generating the files for streaming that I'm having problems with. I've been trying to get FFmpeg to work, without much luck.

Does anyone have any ideas on how I can pull this off? I would like either a known working API or instructions on getting FFmpeg to compile properly in an iPhone app.


1 Answer

0 votes
by (71.8m points)

You could divide your recording into separate files of, say, 10 seconds each, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output changes, you shouldn't drop any frames between files. This has several advantages over frame-by-frame upload (see the sketch after this list):

  • The files can be used directly for HTTP Live Streaming without any server-side processing.
  • The gaps between data transfers let the radios sleep in between if the connection is fast enough, saving battery life.
  • Conversely, if the connection is so slow that upload lags behind recording, managing the delayed upload of a set of files is much easier than managing a stream of bytes.
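Here is a minimal Swift sketch of that idea. It uses AVCaptureMovieFileOutput's maxRecordedDuration as the rollover trigger and batches the session setup inside beginConfiguration/commitConfiguration; the class name SegmentedRecorder and the onSegmentReady callback are illustrative, not part of any Apple API:

```swift
import AVFoundation

// Sketch: record camera + mic into ~10 s files and hand each finished
// segment to a caller-supplied upload hook.
final class SegmentedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    private let session = AVCaptureSession()
    private let movieOutput = AVCaptureMovieFileOutput()
    private var segmentIndex = 0

    /// Called with each finished segment file; wire this to your uploader.
    var onSegmentReady: ((URL) -> Void)?

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video),
              let mic = AVCaptureDevice.default(for: .audio) else { return }
        let videoInput = try AVCaptureDeviceInput(device: camera)
        let audioInput = try AVCaptureDeviceInput(device: mic)

        // Batch the topology changes so the session applies them atomically.
        session.beginConfiguration()
        session.addInput(videoInput)
        session.addInput(audioInput)
        session.addOutput(movieOutput)
        // Cap each file at ~10 s; the delegate fires when the limit is hit.
        movieOutput.maxRecordedDuration = CMTime(seconds: 10,
                                                 preferredTimescale: 600)
        session.commitConfiguration()

        session.startRunning()
        startNextSegment()
    }

    private func startNextSegment() {
        segmentIndex += 1
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("segment-\(segmentIndex).mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    // A segment hit its duration limit (or stopRecording() was called):
    // hand the finished file off and roll over to the next one.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        onSegmentReady?(outputFileURL)
        startNextSegment()
    }
}
```

Note that when the duration cap triggers the rollover, the delegate receives a non-nil error (AVError.maximumDurationReached), but the written file is still complete and playable, so it's safe to upload.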

