TOPIC: Figured out multi-cam bins ...

Re: Figured out multi-cam bins ... 7 months, 1 week ago #196216

  • smyrf
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 45
  • 7 months, 1 week ago
hugly wrote:
I suspect that both the Tentacle software and Resolve will need transcoding (re-encoding) in order to create synced files directly editable with Lightworks, but as long as you stay with intermediate formats, that shouldn't have any impact on visible/audible quality. However, do you know how both solutions handle syncing video to multitrack recordings?

I seem to remember it's saved as metadata in a "sidecar" file, if that's possible? Not sure about multitrack audio... although if it's true that it only syncs with metadata I suppose there wouldn't be any issue?

Here's the latest promo clip!
www.dropbox.com/s/r4gqln3q3gfxu10/Renaissance%20%40%20Get%20Up.mp4?dl=0

Re: Figured out multi-cam bins ... 7 months, 1 week ago #196220

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 7 months, 1 week ago
Thanks for the link.
It's better to travel well than to arrive...

Re: Figured out multi-cam bins ... 7 months, 1 week ago #196233

  • RWAV
  • Moderator
    Pro User
  • OFFLINE
  • Moderator
  • Posts: 6395
  • 7 months, 1 week ago
The sound recording device - a Zoom with T/C - is a good start. Continuous T/C is essential; time-of-day would be great if that is settable on your Zoom. Under no circumstances use any form of 'record run' T/C.

Work in and organise all material on a shooting day basis - a day being no more than 24 hours.
Make a folder for each shooting day and, in that, make folders for each acquisition device - Day 1: Cam1, Cam2, Zoom and so on.

Everything shot on each day is put into its own acquisition device folder.

There is a caveat with the Zoom - as I recall a Zoom will not keep track of T/C if it is switched off or after battery replacement. If that is the case with the Zoom used, then at each T/C reset a new Zoom 'device' must be imagined. That is easy to manage in the Zoom by making a new Zoom directory after any T/C reset - and in the computer one may need Zoom1, Zoom2, Zoom3 virtual acquisition devices. All of this is to guarantee there are no duplicate Zoom timecodes in any one 'Day' on any single device.
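For anyone who prefers to see the layout spelled out, a rough Python sketch of that day/device structure follows - the names (MyShoot, Day1, Cam1, Zoom1, Zoom2) are just examples, not anything Lightworks requires:

from pathlib import Path

# Example only: one folder per shooting day, one sub-folder per acquisition
# device; a second virtual Zoom 'device' appears after a T/C reset.
shoot_root = Path("MyShoot")
days = {
    "Day1": ["Cam1", "Cam2", "Zoom1"],
    "Day2": ["Cam1", "Cam2", "Zoom1", "Zoom2"],  # Zoom2 = after a battery change
}

for day, devices in days.items():
    for device in devices:
        (shoot_root / day / device).mkdir(parents=True, exist_ok=True)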

Shooting - very likely that multiple people on any shoot have a potentially highly accurate T/C capable device in their pockets or handbags - a phone or a tablet - it may involve the expenditure of a few dollars for a T/C app.

Set up the T/C app device(s) somewhere it is easy to swing the camera onto it/them. In a continuous recording, find a convenient time to swing over to the T/C display and shoot a few seconds - this can be anywhere in the recording: beginning, end or anywhere in between.

Retain the integrity of the 'Day' structure when importing into LW.

The imported Zoom sound will have T/C.
For each imported clip - go to a frame where the T/C is readable - then from the Labels Panel enable 'Modify' and type in the numbers on the screen.

Yes, one will have to take care that the Zoom T/C is in the appropriate format. But essentially one can easily add/modify T/C metadata for a clip in LW - we are not talking here about the original camera clips.
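In effect the arithmetic is simple frame counting. A small Python sketch (nothing LW-specific, drop-frame T/C ignored): the T/C read off the screen, minus the frame offset of that readable frame, gives the clip's effective start T/C.

FPS = 25  # example frame rate

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    f = frames % fps
    s = frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

# T/C read off the phone display at, say, frame 137 of the clip:
read_tc = "14:32:07:12"
offset_into_clip = 137
clip_start_tc = frames_to_tc(tc_to_frames(read_tc) - offset_into_clip)
print(clip_start_tc)  # the start T/C the clip effectively gets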

From this point one can use a Kemroll workflow to make 'Syncs' within LW and if necessary export/reimport those as clips, now complete with T/C in a T/C-compliant, non-camera, high-quality format. We would normally use DNxHD - but as I recall Cineform is OK too.

Can follow-up with Kemroll workflow details if there is any further interest.
BETA System
Microsoft Windows 7 Professional 64BIT
HP Z800 Workstation

Re: Figured out multi-cam bins ... 7 months ago #196285

  • smyrf
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 45
  • 7 months ago
Thanks RWAV for that.
I've been absorbing a lot of information from various sources for a while now, on all aspects of filmmaking, all of which I find fascinating; but at the moment a lot of it is still theory for me, and it will take some hands-on practice for it to really sink in.

For most of what I've done so far, I've been able to get away without really having a structured approach, and I just figure it out as I go (based on the different needs of each project). But if anything, even for small projects, I find that when I don't have a proper workflow in place I still waste a lot of time figuring out what goes where, remembering where files are, on which disk, what to do with them when the project's finished, etc. Sometimes I'm just too eager to dive straight in and start editing! But I absolutely appreciate the need for "best practices" and actually do enjoy learning all the dry technical stuff which makes it all come together.

RWAV wrote:
Shooting - very likely that multiple people on any shoot have a potentially highly accurate T/C capable device in their pockets or handbags - a phone or a tablet - it may involve the expenditure of a few dollars for a T/C app.

I had looked into this sort of thing a while ago, but discounted it as not seeming very reliable. From what I understand, an iPhone app would be *in addition to* external clocks (lockit box/etc)?

Retain the integrity of the 'Day' structure when importing into LW.

Do you mean recreating this same structure in LW bins? I always heard you would typically maintain completely different structures at the file level, and in the NLE? Or would you create the bin structure for reels/camera/days but then also create other bins for scene/location, and mostly reference those while assembling an edit?

From this point one can use a Kemroll workflow to make 'Syncs' within LW and if necessary export/reimport those as clips, now complete with T/C in a T/C-compliant, non-camera, high-quality format. We would normally use DNxHD - but as I recall Cineform is OK too.

Can follow-up with Kemroll workflow details if there is any further interest.


I'd gladly take you up on the offer
Actually Kemroll is something I'd tried using recently, where I had a continuous audio signal (recording of a concert) and various interrupted video files from different cameras. But I seem to have understood it requires timecode to work?

Thanks again and sorry for all the questions! I do try to figure this all out on my own as much as possible, mostly because otherwise I'd be spending all day on the various forums I follow

Re: Figured out multi-cam bins ... 7 months ago #196329

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 7 months ago
smyrf wrote:
I seem to remember it's saved as metadata in a "sidecar" file, if that's possible?

Theoretically it's possible to store any kind of metadata in separate files instead of including it in media files. On the other hand, you can store almost anything as metadata in most media containers. The question is, is it standardized? Apart from some well-standardized, but by no means well-supported, formats with sidecar files, like DVD and AVCHD, most existing sidecar files come with proprietary acquisition formats (e.g. ARRI RAW, XDCAM HD, RED RAW) and need proprietary software to be accessed properly. I might be corrected on this, but I would think that none of them stores or supports SMPTE T/C in the sidecar. SMPTE T/C is the common standard for video editing and it comes included in the media file - thank god, because non-standardized metadata are a source of never-ending confusion.

If you can find any editing workflow which uses timecode from a sidecar, please let us know.
It's better to travel well than to arrive...
Last Edit: 7 months ago by hugly.

Re: Figured out multi-cam bins ... 7 months ago #196512

  • smyrf
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 45
  • 7 months ago
hugly wrote:
If you can find any editing workflow which uses timecode from a sidecar, please let us know.


I think this just goes to show how much I still have to learn...

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196519

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 6 months, 4 weeks ago
Just to be clear, I don't say that timecode in a sidecar file doesn't exist or can't be used in principle, but I don't know any which is standardized for use with NLE's. It might well be that proprietary software exists which uses timecode in a sidecar file for proprietary workflows. So, if you hear about anything like this, it would be interesting to know.

A common misunderstanding is that timecode will solve all sync problems in post. This article explains timecode in some detail: blog.frame.io/2017/07/17/timecode-and-frame-rates/

Proper adjustments, genlocked devices and a master clock connected to all? I don't know if the effort pays off for the amateur producer. I think proper adjustments, some tests, and a clapperboard will do in most cases.
It's better to travel well than to arrive...

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196559

  • RWAV
  • Moderator
    Pro User
  • OFFLINE
  • Moderator
  • Posts: 6395
  • 6 months, 4 weeks ago
smyrf, apologies for the late reply.

I had looked into this sort of thing a while ago, but discounted it as not seeming very reliable. From what I understand, an iPhone app would be *in addition to* external clocks (lockit box/etc)?

It would need a proper video-production type of app - showing proper H:M:S:F timecode - and it is instead of external clocks (lockit box etc.).

Or would you create the bin structure for reels/camera/days
Yes, this is absolutely essential for keeping track of which piece of media mates with what.

but then also create other bins for scene/location, and mostly reference those while assembling an edit?
Yes, typically the reels/camera/days is for organisation of camera/recorder media and is always kept under that paradigm.

When picture and sound are synced - by a kemroll method or other - and syncs made as one's edit sources it is those syncs which are then sorted into bins/groups by scene/location/character or other editing parameters.

The two systems run in parallel.

But I seem to have understood it requires timecode to work?
Yes, kemrolls do need T/C. In your case you have continuous run/time of day T/C on your Zoom (with careful management of battery changes). If you shot a T/C app display for each camera take you also have a record of timecode on video - that can be input into camera clips in LW - so now all your LW clips have T/C.

Make an A track kemroll - with empty space between events - of a Sound reel
Make a V track kemroll - with empty space between events - of a picture reel matching the above

Put the V and A kemrolls into an edit - manually line up the first event - it is now in sync. For subsequent events in the kemroll - even though the V and A timecodes may be wildly different - the time elapsed between events will be identical - so all subsequent events will be in sync - sure one may need a bit of tweaking to allow for device T/C drift.

Good if that's been supported by a clapper - can help a bit with tweaking - but in music shooting that's not always possible.

All one needs is a good understanding of how to define a 'reel' as being material recorded while the T/C source (Zoom or phone app) has been in time-of-day/continuous-run.
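For anyone who likes to see the logic in the abstract, here is a rough Python sketch of that "line up the first event, the rest follow by elapsed time" idea - the timecodes are invented for the example, and the real work is of course done inside LW, not in code:

FPS = 25  # example frame rate

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# Invented example: camera T/C (from the phone app) and Zoom T/C differ wildly,
# but the elapsed time between events is the same on both devices.
video_events = ["10:00:00:00", "10:07:30:00", "10:21:05:10"]  # V kemroll event starts
audio_events = ["14:32:07:12", "14:39:37:12", "14:53:12:22"]  # matching A kemroll event starts

# Line up the first event manually - that fixes a single offset...
offset = tc_to_frames(audio_events[0]) - tc_to_frames(video_events[0])

# ...and every later event then lands in sync automatically (apart from drift).
for v, a in zip(video_events, audio_events):
    error = tc_to_frames(a) - (tc_to_frames(v) + offset)
    print(f"video {v} vs audio {a}: residual {error} frames")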

hugly wrote:
If you can find any editing workflow which uses timecode from a sidecar, please let us know.

That's too easy. Lightworks: a) always, and b) by adding T/C to a clip in LW, the .ed5 file is the 'sidecar' containing modifiable timecode metadata - and so much more.
BETA System
Microsoft Windows 7 Professional 64BIT
HP Z800 Workstation

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196560

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 6 months, 4 weeks ago
It's truly a pity that the ed5 file format wasn't patented and standardized in the nineties. Lucasfilm and Sony would have picked it up first, SMPTE would have made a standard out of it, Microsoft and Apple couldn't have resisted integrating it in their media formats, and probably the world of today would have ed5 as a standardized sidecar file for media data exchange globally - and Lightworks wouldn't have changed owners a few times since then.

It's better to travel well than to arrive...
Last Edit: 6 months, 4 weeks ago by hugly.

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196577

  • smyrf
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 45
  • 6 months, 4 weeks ago
hugly wrote:
Just to be clear, I don't say that timecode in a sidecar file doesn't exist or can't be used in principle, but I don't know any which is standardized for use with NLE's. It might well be that proprietary software exists which uses timecode in a sidecar file for proprietary workflows. So, if you hear about anything like this, it would be interesting to know.

If I understand correctly, the whole point here is that an extra step of transcoding is required in order to embed the timecode in the video file? Whether through an external utility, or in Resolve etc. In my case, getting ProRes out of camera, transcoding would indeed be an extra unnecessary step, unless the utility can embed the timecode without transcoding/re-encoding (but being metadata, I suppose that should be possible).

hugly wrote:
A common misunderstanding is that timecode will solve all sync problems in post. This article explains timecode in some detail: blog.frame.io/2017/07/17/timecode-and-frame-rates/

Proper adjustments, genlocked devices and a master clock connected to all? I don't know if the effort pays off for the amateur producer. I think proper adjustments, some tests, and a clapperboard will do in most cases.

Thanks; I think I've come across that useful article before (and others from them) in my research.

Certainly for the kind of thing I'm doing now I can easily get by without timecode, but I'm mostly researching for the future where I plan on working on more substantial jobs.

Initially I thought it was required for any interchange between different software platforms; and while I don't see myself doing much (simultaneous, sound-synced) multi-cam stuff, it has already come up once (a concert recording), and it also applies to things like interviews where you want several angles of the same take. And in the case of concert recordings, you have two limiting factors: relatively long takes (depending on whether the camera is manned or not), and the inability to clap a slate!

(The bigger issue I suppose it would address would be drift, with longer takes, although from what I've read it seems that with today's cameras - even at the prosumer level - it shouldn't be noticeable on anything with the typical shot length of most narrative work.)

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196578

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 6 months, 4 weeks ago
smyrf wrote:
If I understand correctly, the whole point here is that an extra step of transcoding is required in order to embed the timecode in the video file?

No, technically it isn't necessary to transcode in order to apply timecode; it can be injected into the metadata of the media file itself by remuxing without re-encoding. One point was that SMPTE TC isn't supported in sidecar files; the second, that we talked about BM Resolve and the Tentacle Sync software. I might be corrected on this, but as far as I know, both don't support remuxing in order to inject timecode.

There are NLE's which can convert LTC recorded on embedded audio tracks by converting to SMPTE internally. As mentioned earlier, a feature request exists for that and I can imagine that this feature will come sooner or later, since it's a cheap way to record timecode on low-budget cameras.
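As a concrete example of remuxing rather than re-encoding (a sketch, not a recipe - whether the -timecode option is honoured depends on the ffmpeg build and the container, and MOV/MXF carry timecode tracks better than plain MP4):

import subprocess

# Sketch only: stream-copy the media (no re-encode, no quality loss) while
# writing a start timecode into the output container with ffmpeg.
subprocess.run([
    "ffmpeg", "-i", "clip_from_camera.mov",
    "-c", "copy",                # copy video/audio streams as-is
    "-timecode", "14:32:07:12",  # invented start T/C for the example
    "clip_with_tc.mov",
], check=True)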

It shouldn't be noticeable on anything with the typical shot length of most narrative work.

Yes it shouldn't. A few tests or the first production will show.

Edit: To avoid confusion: LTC is standardized by the SMPTE organisation and it's just one of several possible representations of SMPTE timecode. The timecode is encoded as an audio signal, so it can be recorded by audio devices. NLE's which can handle it decode the signal and convert the information into their own internal representation of timecode, which is binary data.
It's better to travel well than to arrive...
Last Edit: 6 months, 4 weeks ago by hugly.

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196580

  • jwrl
  • Moderator
    Pro User
  • OFFLINE
  • Moderator
  • Posts: 11587
  • 6 months, 4 weeks ago
hugly wrote:
There are NLE's which can convert LTC recorded on embedded audio tracks by converting to SMPTE internally.

Yes, and Lightworks used to be one of them back when it ran on dedicated hardware. It also used to handle VITC. It shouldn't be particularly hard to implement LTC decoding, but the need for it is getting rarer and rarer. I doubt whether it would be a particularly high priority.
Last Edit: 6 months, 4 weeks ago by jwrl.

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196581

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 6 months, 4 weeks ago
I think there's some kind of renaissance of LTC. Not in the professional area, but on the consumer market. As soon as people start recording with more than one device, the need for timecode quickly arises. With an LTC generator (e.g. Tentacle Sync) you can supply timecode to almost everything which is able to record audio: pocket cameras, iPhones, webcams, camcorders, screen recording software. Devices which provide digital T/C connectors are expensive.
It's better to travel well than to arrive...

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196582

  • hugly
  • OFFLINE
  • Platinum Boarder
  • Posts: 21309
  • 6 months, 4 weeks ago
Or an iPad app as LTC generator and some cable connectors to the recording devices?

It's better to travel well than to arrive...

Re: Figured out multi-cam bins ... 6 months, 4 weeks ago #196584

  • smyrf
  • Pro User
  • OFFLINE
  • Senior Boarder
  • Posts: 45
  • 6 months, 4 weeks ago
RWAV wrote:
Yes, this is absolutely essential for keeping track of which piece of media mates with what.

but then also create other bins for scene/location, and mostly reference those while assembling an edit?
Yes, typically the reels/camera/days is for organisation of camera/recorder media and is always kept under that paradigm.

When picture and sound are synced - by a kemroll method or other - and syncs made as one's edit sources it is those syncs which are then sorted into bins/groups by scene/location/character or other editing parameters.

The two systems run in parallel.

Ok, it's all starting to make sense. The benefits of this sort of organised approach will certainly become more obvious with increasingly bigger projects.

RWAV wrote:
But I seem to have understood it requires timecode to work?
Yes, kemrolls do need T/C. In your case you have continuous run/time of day T/C on your Zoom (with careful management of battery changes). If you shot a T/C app display for each camera take you also have a record of timecode on video - that can be input into camera clips in LW - so now all your LW clips have T/C.

(Well, for the time being it's all academic since I don't have the Zoom yet!) But what I don't understand is what utility the phone app has as a source for T/C when it can't jam to anything? I could see how it'd be useful in syncing several video files, but for video+audio, I don't see how that works when the phone is running independently. Or is it just a matter of then adding an offset based on the T/C on the audio device? Sorry, this is still all very hazy for me at the practical level. I do realise the utility of having all your shots carry consistent time-of-day T/C (albeit with an offset, in relation to the audio).

RWAV wrote:
Make an A track kemroll - with empty space between events - of a Sound reel
Make a V track kemroll - with empty space between events - of a picture reel matching the above

Put the V and A kemrolls into an edit - manually line up the first event - it is now in sync. For subsequent events in the kemroll - even though the V and A timecodes may be wildly different - the time elapsed between events will be identical - so all subsequent events will be in sync - sure one may need a bit of tweaking to allow for device T/C drift.

Wow... I think you just came back from the past and answered my question above. The first time I read this it didn't make sense to me at all; but having gone through the thought process to arrive at the above conclusion/question, this is now crystal clear. Don't you love those epiphany moments!
Last Edit: 6 months, 4 weeks ago by smyrf.