It's official; Film is Cool

Some of us are passionate about film photography, in which the process is part of the enjoyment as much as the end result.

I love digital technology, but working on photos in Lightroom or Photoshop doesn't give me the same pleasure as being in the darkroom.

But digital is not evil, it's cool even! :)

I went to school for photography in the 90s. We didn't do any digital at all (too early). I'm very familiar with printing B&W, C-prints, Cyanotypes, and Cibachromes, etc. So, just for the record, I am not someone who has only used digital. I still use film a little bit even now (because I enjoy the old cameras).

I understand the fundamental difference between the processes (and the fun factor)... but I think too much is put into the "mystical" side of photography where film supposedly has "soul" and digital supposedly doesn't. When some people look down on you for using digital... it's just silly.

If I were in school these days and the program didn't teach digital, I would feel I was missing something. I believe it is important to teach both. I just don't feel learning from film is any better than learning from digital.
 
No, there is far more than history to be learnt. A 10x8 camera can be had for peanuts. A digital back for a large format camera can't. Film makes entry into the marketplace far cheaper for people starting out with little money.

How many people are getting started with an 8x10 camera these days (very few)?

How much is 8x10 film (which if learning with, you will waste a lot of to get very little)?

How much does film, chemicals, a scanner, darkroom time or equipment, printing paper, etc cost (a lot)?

I completely agree that these processes / formats should be taught in a photography program, but it is no cheaper than digital in the long run. Believe me, I've been through it. Both are money suckers.
 
Well, I'm not so sure it has more to offer... I think they just consider it an important process to learn from a historical standpoint. Digital lets one see one's mistakes a lot quicker... I would think that is beneficial for learning.

I think (and the current pedagogical thinking is) that being able to see and correct mistakes with little effort or thought is bad for learning. It promotes an iterative approach rather than separating analysis and planning from doing. That approach should be used, of course, but over a longer time scale, so that the lessons learned have time to marinate.

Take exposure and judging the highlights and shadows for example. With digital, one shoots and chimps until the specular highlights are gone, usually by evaluating "blinkies" on the back of the camera. Thus rote mechanisms are learned. With metering, developing and printing, one has time - and the drive - to analyze meter readings and expose accordingly. That way, even when the student becomes an accomplished digital shooter, they will have learned how to minimize the time spent looking at the back of the camera and adjusting the controls, in a trial and error style.
 
I'm looking forward to image samples from Sigma's new $10,000 DSLR. The sensor in that camera is supposed to be the most film-like digital sensor to date, and Sigma is using that very 'film-like' quality as a selling point, using the exact words 'film like'... Now, if camera makers are selling their DSLRs at insane prices because they're 'film-like', it's hard not to conclude where the standard is, where digital wants to go, and where the demand for quality lies... I'm sure the marketing people at Sigma aren't all on some drug, pricing their camera at $10,000 with no plausible reason to do so. But as I said, we'll have to wait for the image samples, and until then get the same film-like look by actually using... film.
 
I'm looking forward to image samples from Sigma's new $10,000 DSLR. The sensor in that camera is supposed to be the most film-like digital sensor to date, and Sigma is using that very 'film-like' quality as a selling point, using the exact words 'film like'... Now, if camera makers are selling their DSLRs at insane prices because they're 'film-like', it's hard not to conclude where the standard is, where digital wants to go, and where the demand for quality lies... I'm sure the marketing people at Sigma aren't all on some drug, pricing their camera at $10,000 with no plausible reason to do so. But as I said, we'll have to wait for the image samples, and until then get the same film-like look by actually using... film.

I would conclude that, with high-end, film-like digital, one is paying only for speed and convenience. In some professions, that is worth the price tag. As a photo enthusiast... not so much.
 
I would conclude that, with high-end, film-like digital, one is paying only for speed and convenience. In some professions, that is worth the price tag. As a photo enthusiast... not so much.

Even in those professions, it's just a little too much of a stretch to dish out $10,000 for a DSLR that surely is not going to outperform MF digital, or even FF digital.

But in the case of Sigma, I think they're just marketing their sensor, hoping they'll get a contract for it from other camera makers or, who knows, end up selling the Foveon division to Sony.
 
I think (and the current pedagogical thinking is) that being able to see and correct mistakes with little effort or thought is bad for learning. It promotes an iterative approach rather than separating analysis and planning from doing. That approach should be used, of course, but over a longer time scale, so that the lessons learned have time to marinate.

Take exposure and judging the highlights and shadows for example. With digital, one shoots and chimps until the specular highlights are gone, usually by evaluating "blinkies" on the back of the camera. Thus rote mechanisms are learned. With metering, developing and printing, one has time - and the drive - to analyze meter readings and expose accordingly. That way, even when the student becomes an accomplished digital shooter, they will have learned how to minimize the time spent looking at the back of the camera and adjusting the controls, in a trial and error style.

I can see your point, though I wasn't really thinking of chimping; it is inevitable, I guess. However, back in the day, by the time I developed my film and got around to printing, all the technical stuff was forgotten. ;)

I think it is up for debate whether one needs time for things as simple (and as complex) as metering to marinate. People learn at different speeds. Also, I think too many people assume that just because one uses a digital camera, one doesn't know about exposure, shutter speeds, apertures, etc. I would argue that proper exposure is even more important with digital. Not all digital shooters even use the histogram.

And let's remember that photography is not all technical... learning is about seeing as well... and the quicker one can see results, the quicker one can see one's conceptual / compositional mistakes and learn from them.
 
NCPS, do you eat that with a spoon or a fork? Let's hope, for my sake, that the 'cool' 35mm film hype results in sustained film production at lower prices. I am not optimistic.
 
I went to school for photography in the 90s. We didn't do any digital at all (too early). I'm very familiar with printing B&W, C-prints, Cyanotypes, and Cibachromes, etc. So, just for the record, I am not someone who has only used digital. I still use film a little bit even now (because I enjoy the old cameras).

I came from the opposite direction :)
Having learned photography with digital, I find the hands-on film processes far more rewarding than any pixel-pushing I've done. And just to set the context: I've been paid to push pixels, not just toyed around personally.

I understand the fundamental difference between the processes (and the fun factor)... but I think too much is put into the "mystical" side of photography where film supposedly has "soul" and digital supposedly doesn't. When some people look down on you for using digital... it's just silly.

The question is, how many people still do that? Compare that with the number of photographers who are completely oblivious to the benefits of film and its processes due to the lack of mention.

Digital is here and it's here to stay. It will continue to improve and mature as a medium of photography. In the eyes of many, it *is* the only way to photograph.

To me, the danger of losing those historic film processes due to ignorance and bias *far* outweighs some bruised ego because someone looked down on you (= you in general, not jsrockit :) ) for using digital.


If I were in school these days and the program didn't teach digital, I would feel I was missing something. I believe it is important to teach both. I just don't feel learning from film is any better than learning from digital.

It is debatable whether learning is better with film or digital; I don't have a say, because I learn using both and I don't teach photography classes.

I agree with you that teaching both is the best approach. Many film-based processes already benefit from digital technologies (digital negatives, scanning vs. contact printing, digital proofing, etc.).
 
How many people are getting started with an 8x10 camera these days (very few)?

How much is 8x10 film (which if learning with, you will waste a lot of to get very little)?

How much does film, chemicals, a scanner, darkroom time or equipment, printing paper, etc cost (a lot)?

I completely agree that these processes / formats should be taught in a photography program, but it is no cheaper than digital in the long run. Believe me, I've been through it. Both are money suckers.

Barring the rare case of people learning photography with a Deardorff, there are still plenty of people today who will learn with that "old crummy camera" a dad, mom, uncle, relative, or buddy hasn't used since 2003, when they bought a Rebel XTi.

So the entry cost is absolutely zero in their case. They will shoot drugstore film, get 1-hour prints, and look at their work, and learn.

Eventually, when the long run begins, they will make a choice about how to invest their money: a large one-time expense for long-lasting equipment vs. regular small expenses for the benefit of flexibility.

People start photography step by step. Just as they rarely start by buying an 8x10, they will seldom begin by plunking down $5,000 on a full-blown FX kit.

If they are serious about photography, they will eventually face that decision. But it remains that many people get their first chance at photography with a free hand-me-down camera, usually a 35mm SLR.

Over the coming years there will be more and more hand-me-down digital cameras, but in the meantime, the film cameras sitting in drawers and wardrobes still represent a critical mass of free gateway drugs into the wonderful world of photography.
 
Over the coming years there will be more and more hand-me-down digital cameras

I would seriously doubt that. Digital cameras will more than likely suffer from obsolescence, and eventually trying to use even your £3000 EOS 5D will require more jumping through hoops than trying to use a Pentax 110 Auto does today. It's not impossible, of course - I seem to remember reading a post around here about antiquated computer setups - but in 20 or 30 years' time it won't be anything like as convenient as passing on an old 35mm Praktica SLR is today.
 
...
It is debatable whether learning is better with film or digital; I don't have a say, because I learn using both and I don't teach photography classes.

I agree with you that teaching both is the best approach. Many film-based processes already benefit from digital technologies (digital negatives, scanning vs. contact printing, digital proofing, etc.).

The best programs do teach both. Traditionally, photography programs begin with a film course and then use digital for color and commercial classes. Over the last five years, however, students have been demanding an all-digital program, since they are less interested in the artistic use of photography and seek to use it as a career.

At my school this has led to a two-year certificate in digital photography. Note, however, that a basic course in film (shooting, processing, and printing) is required for that certificate, but that is all the film content that is required. Our degree programs still require two to four film-based classes, in addition to several digital classes.

For the last two years, we have been integrating film and digital into the first two photography classes. We have greater enrollment in those classes, but also more confusion and more discontinuity in the students' work during their first year. This, however, is not necessarily a bad thing.
 
Well, film was really cool from the start. Remember the first time you said "ooohh" or "ahhh" over your first successful development? Not to mention the interest of those who see your negatives!

Digital can't draw out those oohs and ahhs... unless you're chimping! :D
 
It's ironic, isn't it: when you're faced with the shot of a lifetime (Lord Lucan riding a Yeti against the elusive green flash of a tropical sunset), you have to go fully manual. Yes, you can make errors, but at least they're your errors and not the errors of a computer programmer in Tokyo.
 
It's ironic, isn't it: when you're faced with the shot of a lifetime (Lord Lucan riding a Yeti against the elusive green flash of a tropical sunset), you have to go fully manual. Yes, you can make errors, but at least they're your errors and not the errors of a computer programmer in Tokyo.

Hi,

Well, I dunno; some of us still put our digital cameras and our modern film ones back in the case set to "P", "AF", and auto-everything-else, just so we can whip them out ready for grab shots. And didn't we put the older ones back in the case at 1/100, f/5·6 or f/8, focused on whatever it was? I wish I could remember; it shows how auto-everything* is destroying the old crafts...

Regards, David

*For example, if we were talking face to face I wouldn't have to keep logging in every five minutes.
 
Is that automation's fault or our own?
It is possible to utilise automation to improve on something that did without it.
 
Is that automation's fault or our own?
It is possible to utilise automation to improve on something that did without it.

Doesn't this ignore the very real human trait of lassitude? If the machine is gonna handle it mostly okay, then I can ignore it.

Our fault, to be sure, but hey. We had already worked out the solution. Why relearn those lessons if we needn't? This isn't to say get rid of automation, but don't lose personalized control either.
 