You can use the FFmpeg library for this, but as far as I know you have
to code against it rather than just use it from the command line.
You basically need to set up a video decoder from a complete sample
file and then use that decoder on the raw stream. You then won't need
the moov atom, since the codec settings will be taken from the sample
file. This assumes that the sample file and the raw stream are created
with the same codecs in the Android app.
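
A minimal sketch of that approach against FFmpeg's
libavformat/libavcodec follows. The file names "sample.mp4" and
"raw.mdat" are placeholders, error handling and cleanup are mostly
omitted, and it assumes the codec's stream parser can find frame
boundaries in the raw bytes (note that H.264 stored in MP4 is
length-prefixed rather than start-code based, so it may need extra
handling):

#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    /* 1. Open the complete sample file and find its video stream. */
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, "sample.mp4", NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);
    int vidx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (vidx < 0)
        return 1;

    /* 2. Build a decoder from the sample file's codec parameters; this
     *    stands in for the information the missing moov atom carries. */
    AVCodecParameters *par = fmt->streams[vidx]->codecpar;
    const AVCodec *codec = avcodec_find_decoder(par->codec_id);
    if (!codec)
        return 1;
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(ctx, par);
    if (avcodec_open2(ctx, codec, NULL) < 0)
        return 1;

    /* 3. Push the raw (mdat-only) bytes through a parser so the
     *    decoder receives whole frames. */
    AVCodecParserContext *parser = av_parser_init(codec->id);
    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    FILE *raw = fopen("raw.mdat", "rb");
    uint8_t buf[4096];
    int n;
    while (raw && (n = (int)fread(buf, 1, sizeof(buf), raw)) > 0) {
        uint8_t *p = buf;
        while (n > 0) {
            int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                        p, n, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
            p += used;
            n -= used;
            if (pkt->size > 0 && avcodec_send_packet(ctx, pkt) == 0)
                while (avcodec_receive_frame(ctx, frame) == 0)
                    ;  /* "frame" now holds one decoded picture */
        }
    }
    if (raw) fclose(raw);
    /* (cleanup of the FFmpeg objects omitted for brevity) */
    return 0;
}

The key point is step 2: avcodec_parameters_to_context() carries over
the extradata (SPS/PPS and friends) that would otherwise have to come
from the moov atom.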
Here is a small project that uses FFmpeg for a related purpose:
http://vcg.isti.cnr.it/~ponchio/life/code/untrunk_readme.html
An FFmpeg tutorial is available here:
http://dranger.com/ffmpeg/
Best of luck,
Ivar
On Jul 22, 11:46 am, RFuente <fuente.mu...@gmail.com> wrote:
> Hi everyone.
>
> I'm trying to do live-record streaming from an Android phone and got
> stuck on a problem that is quite difficult to tackle. After messing
> around with a hex editor, I've found that the MediaRecorder class
> places the moov atom at the end of the MP4 file. As this atom is
> necessary to work with the MP4 file and I'm sending the stream while
> recording, I'm 'blind' on the server side until the transfer is
> completed.
>
> The idea is to start playing/decoding/whatever the file on the server
> side in real time, but at this point the received file only contains
> mdat bytes. Has anyone faced a similar problem? Do you have any idea
> whether this task can be accomplished?
>
> Thanks in advance.
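
For reference, the layout described above can be confirmed
programmatically as well as with a hex editor. Below is a minimal
sketch that walks the top-level MP4 boxes and prints them in order
("out.mp4" is a placeholder for the recorded file; fseek would need to
be fseeko for files over 2 GB):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    FILE *f = fopen("out.mp4", "rb");
    uint8_t hdr[8];
    if (!f)
        return 1;
    /* Every top-level box starts with a 32-bit big-endian size
     * followed by a 4-character type (e.g. ftyp, mdat, moov). */
    while (fread(hdr, 1, 8, f) == 8) {
        uint64_t size = ((uint64_t)hdr[0] << 24) | ((uint64_t)hdr[1] << 16) |
                        ((uint64_t)hdr[2] << 8) | hdr[3];
        char type[5];
        memcpy(type, hdr + 4, 4);
        type[4] = '\0';
        uint64_t skip;
        if (size == 0) {              /* box runs to the end of file */
            printf("%s (to EOF)\n", type);
            break;
        } else if (size == 1) {       /* 64-bit "largesize" follows */
            uint8_t big[8];
            if (fread(big, 1, 8, f) != 8)
                break;
            size = 0;
            for (int i = 0; i < 8; i++)
                size = (size << 8) | big[i];
            if (size < 16)
                break;                /* corrupt header */
            skip = size - 16;
            printf("%s (%llu bytes)\n", type, (unsigned long long)size);
        } else {
            if (size < 8)
                break;                /* corrupt header */
            skip = size - 8;
            printf("%s (%llu bytes)\n", type, (unsigned long long)size);
        }
        if (fseek(f, (long)skip, SEEK_CUR) != 0)
            break;
    }
    fclose(f);
    return 0;
}

On a file captured mid-transfer this would typically print ftyp and
then mdat, with no moov following yet, which is exactly the 'blind'
situation described.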