When the article refers to 'graphics-arts cameras', what does it mean?
Is there a spectral notch filter applied to the CCD, or is the article a troll?
"Scanning in black-and-white makes it possible for the non-photo blue still to serve its original purpose, as notes and rough sketching lines can be placed throughout the image being scanned and remain undetected by the scan head."
Any black-and-white scanner should have a roughly flat spectral response and pick up blue just fine, the same way black-and-white photographs render the blue sky as darker than white.
It's entirely possible that older lithographic film didn't have much response in the blue, but there's really no way that a modern imaging system won't pick it up.
What am I missing?
Edit: Experiment is the arbiter of truth: I took a picture of the screen with my digital SLR. As expected, every color swatch in the article is blue. Desaturated the RAW image. Looks grey.
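To put a number on it, here's a minimal sketch of the same check in Python, assuming the commonly cited non-photo blue value of #A4DDED (the exact hex is my assumption; any light cyan behaves similarly):

    # Rec. 601 luma, the usual "desaturate" weighting
    r, g, b = 0xA4, 0xDD, 0xED  # assumed non-photo blue sRGB value
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    print(round(luma))  # ~206 out of 255: a light grey, well short of paper white

So the blue doesn't vanish under desaturation; it just turns into a perfectly visible grey.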
The goal was to compose a layout into a single image.
You created a layout by literally cutting and pasting things onto a board. Then you placed that board in the area at the bottom of the camera and took a picture of it that was captured on film loaded in the top.
You're right that the film was special, but it's the other way around from how you were thinking. The film was not sensitive to red light. To this film, red is "black" and cyan or blue is "white".
Why this was useful:
- You could open the box of film (it came in sheets) in a room that was darkened except for a red bulb, without exposing it.
- You could use overlays of transparent red material (rubylith) to mask things precisely. Even though you could see through to the layer below, the camera would see it as all black.
- And, as the article mentioned, you could add notes to the layout with blue pencil and they would be invisible in the transfer. We always called this "non-repro blue" though, as in, the camera wouldn't reproduce it.
-- Litho film was also very high contrast so everything pretty much came out black or white. (Photos weren't actually reproduced as greyscale but rather as a set of larger or smaller black dots using a halftone screen. This still applies when things are printed.)
-- Because litho film was sensitive to blue, the non-repro blue writing on the white paper would, like the white itself, be an exposed part of the image. This produced a black area on the negative, where the silver halide had been turned into metallic silver. That black area would then become white again when the negative was used to create a printing plate. (A digital analogue of this blue sensitivity is sketched below.)
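Here's a hedged sketch of that blue sensitivity in the digital domain, using Pillow (the file names and threshold are placeholders, not a real workflow). Keep only the blue channel, where non-repro blue reads almost as bright as the paper, then apply a litho-style hard threshold:

    from PIL import Image

    scan = Image.open("pasteup_scan.png").convert("RGB")  # placeholder file name
    blue = scan.split()[2]  # blue channel only: non-repro blue reads near paper-white here
    # litho-style high contrast: everything becomes pure black or pure white
    litho = blue.point(lambda v: 255 if v >= 200 else 0)  # threshold is a guess; tune per scan
    litho.save("litho_simulation.png")

Black ink is dark in every channel, so it survives the threshold, while the blue pencil marks disappear into the white, just like on the film.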
Yeah, I expect that's just someone with a mania for Wiki-standardization. It's not a precise shade; any cyan-ish color would do. In practice non-repro pencils and markers varied from sky blue to a rich turquoise.
The article seems confused: it's implying that there is some magic shade of blue that cameras can't see (even today), which is totally wrong. I think that's why someone found it interesting to post here.
Graphic arts film wasn't at all fussy about the shade of blue (as you note), so while there were expensive non-repro blue markers and pencils, everyone I knew (at the very end of the era of graphics-arts cameras) used blue highlighters; design studios were full of them.
I've stuck with blue as the only highlighter colour I'll ever use, more than 20 years after the original rationale stopped applying.
Also, I used to freak people out by scribbling (non-repro) obscenities on a flat that was about to be sent to photo and turned into a newspaper the next day.
Especially since an sRGB triplet only specifies how to perceptually reproduce the color, and film has a different spectral response from the human eye. The dye in non-photo blue should probably reflect only in the blue part of the spectrum, with no reflectance in the red or green range, since anything there would likely show up on film.
I believe they are referring to a technology of the ancients where they made thin films of photosensitive chemicals, exposed them to light, then processed them to make images. The chemicals varied in which wavelengths would activate them.
For instance, red light would not activate the paper commonly used for black-and-white prints, hence the red lights in darkrooms.
It is also possible the cameras illuminated the artwork with a light to which the non-photo blue ink was effectively transparent.
The magic word here is "orthochromatic". Orthochromatic photo emulsions (the light-sensitive part of film or photo paper) are only sensitive to short wavelengths of light. The first photo emulsions were all orthochromatic, which made skin look weird. Later we developed panchromatic film, which is equally sensitive to all colors. It replaced ortho in the camera, but ortho continued to be very useful in the darkroom and in compositing, because it allows the red safelight and tricks like non-photo blue.
Not necessarily. Orthochromatic ("correct colour"), or ortho, materials were actually improved-spectrum materials that were sensitive well into the yellow-green. Prior to that, film and paper were really only significantly sensitive to blue/ultraviolet or "actinic" light. Getting to panchromatic ("all colours") was indeed significant, but ortho was advanced technology at the time. (And yes, being able to see what you were doing in the process room was a Good Thing™. Also, rubylith for masking.)
It's a real thing: in ancient times when I worked on a yearbook staff, we used non-photo blue markers to mark up the physical pages we sent to the publisher.
I don't know how these pre-digital reproduction systems excluded the blue, nor do I know if this system is still in use in the digital era.
The article does mention diddling with the contrast and brightness as well as desaturating the image. However, it doesn't give references to digital workflows that actually work like this. I associate non-repro blue grids and pens with doing physical paste-up on a light table. I wouldn't think they'd be part of a typical digital flow, although someone in that business would know better than I.
[Edit: As someone wrote, the article just seems confused. Yeah, you can adjust a digital B&W image so that a light blue goes away. You can also adjust it so a light yellow or a light anything goes away (see the sketch below). Digital sensors do have different wavelength sensitivities, but the use of non-repro blue and rubylith was a function of the specific sensitivities, or lack thereof, of litho film.]
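For what it's worth, a minimal sketch of that kind of adjustment, assuming a Pillow workflow (file names and levels are placeholders): pick the channel in which the markup color reads nearly paper-white (blue for light blue, red or green for light yellow), then raise the white point so it drops out:

    from PIL import Image

    scan = Image.open("markup_scan.png").convert("RGB")  # placeholder file name
    r, g, b = scan.split()
    band = b  # light blue vanishes in the blue channel; use r or g for light yellow
    WHITE_POINT = 200  # assumed level: anything brighter maps to pure white
    cleaned = band.point(lambda v: min(255, v * 255 // WHITE_POINT))
    cleaned.save("cleaned.png")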
I mean that at that time non-digital photography hadn't completely died out. I recall watching a TV show comparing the quality of digital and film photography.