PREPARED BY THE ACADEMIC ASSISTANCE TEAM, DEPARTMENT OF JOURNALISM
UNIT I
• Sound-scape, Sound culture
• Types of sound: Sync, Non-Sync, Natural sound, Ambient sound
• Microphone- Different kinds of microphones (dynamic, condenser, directional
microphones)
• Basics of Sound Design
• What is a visual?
• Visual Culture in media studies
• Politics of an image
• Ecology of image
UNIT II
• Basics of a Camera- (Lens & accessories)
• Camera Movement
• Types of shots, Focusing, Depth of field, Lighting
• Visual Perspective
UNIT III
• Electronic News Gathering (ENG) & Electronic field Production (EFP) (Concept)
• Elements of a Television News Story: Gathering, Writing/Reporting.
• Elements of a Television News Bulletin
• Basics of Editing for TV- Basic Software and Techniques (for editing a news capsule)
Soundscape:-
Sound can relay meaning, emotion, memory, and facts through language, music, and field recordings. Sound, when understood as an environment, is a soundscape: a powerful tool that helps humans relate to their surroundings. Soundscapes can be consciously designed by an individual or group of individuals, or be the byproduct of historical, political, and cultural circumstances.
They may be musical compositions, ethnographic anthropological field recordings, recordings of a rainforest taken by an ecologist, or imaginings of a sound designer/historian ruminating upon the sounds of the past.
The things that make the soundscape of a place different from any other place in the world are soundmarks. The soundmark of your home may be a small fountain bubbling away in the corner of your living room, or the tinkle of a wind chime in your backyard. You like these sounds because they make you feel a certain way, and now they color everyone else’s impression of the sonic environment of your personal space.
Power Of Soundscape:-
All members of the community may have individual reactions to the greater soundscape. This contrast between the soundscape at the micro and macro levels helps paint a picture of the diverse makeup of a
community. This is the power of the soundscape.
The study of soundscape:-
It is the subject of acoustic ecology. The idea of soundscape refers to both the natural acoustic environment, consisting of natural sounds, including animal vocalizations and, for instance, the sounds of weather and other natural elements; and environmental sounds created by humans, through musical composition, sound design, and other ordinary human activities including conversation, work, and sounds of mechanical origin resulting from use of industrial technology. The disruption of these acoustic environments results in noise pollution.
Sound Culture
The sound culture is the auditory environment (or soundscape) located within its wider social and cultural context. The concept of a sound culture (also called an auditory or aural culture) is directly connected to the soundscape.
What are the broader social and cultural influences on what we hear in our everyday lives?
Social and cultural organisations are largely responsible for the sound landscape that we inhabit and these inevitably change over time.
In the pre-industrial European world one of the defining features of the soundscape was the tolling of the church bell. It told the workers in the field of the progress of their day’s toil, but was also an auditory marker of the community briefly enveloped in the sound of the bell. It also reminded those who heard it of the centrality of the church in their lives.
The bell's ring was part of the soundscape, but the social and religious dimensions, which add meaning to the sound, are also part of the wider sound culture. Industrialisation created a very different soundscape, the soundscape of modernity. The cities became unprecedentedly loud; ‘the din of modern technology: the roar of elevated trains, the rumble of internal combustion engines, the crackle and hiss of radio transmissions’.
These sounds, including jazz, were part of the particularly rich and complex sound culture of America’s African American population living in the ghettos. So much of contemporary sound culture is the sound of electronic media: the personalised media of the phone and the iPod; music, the radio and television at home; recorded sound in the shops, pub and nightclub.
TYPES:
Sync sound
Sync sound (synchronized sound recording) refers to sound recorded at the time of the filming of movies. It has been widely used in movies since the birth of sound movies.
Even in the silent film era, films were shown with sounds, often with musical accompaniment by a pianist or an orchestra keeping time with the screen action. An early synchronization method used a rotating recording device marked with a white spot. As the white spot rotated, the cameraman hand-cranked the camera to keep it in sync with the recording. The method was then repeated for playback, but with the projectionist hand-cranking the film projector.
Examples: a shot of a river with rushing water sounds, or punching sounds during a fight scene.
Non Sync sound
Nonsynchronous sound can also be ambient. It is the product of postproduction technicians determining the emotional and intellectual impact of a certain scene through sound. At the simplest level, music is used to shape the audience's response to a particular moment.
Example: Nonsynchronous sound is any noise whose origin you can't see: that gunshot in the dark that almost hits the hero.
So the difference between synchronous vs asynchronous sound in film is the visibility of the corresponding action taking place on camera when the sound is made. When the sound corresponds with an action on screen, it is synchronous. When the sound does not correspond with an action on screen, it is asynchronous.
Natural Sound
Natural sounds are any sounds produced by non-human organisms as well as those generated by natural, non-biological sources within their normal soundscapes. Natural sounds create an acoustic space.
Ambient sound (background noise)
Ambient sound means the background sounds which are present in a scene or location. Common ambient sounds include wind, water, birds, crowds, office noises, traffic, etc.
Ambient sound is very important in video and film work. It performs a number of functions including:
● Providing audio continuity between shots.
● Preventing an unnatural silence when no other sound is present.
● Establishing or reinforcing the mood.
What is sound design?
Sound design is the craft of creating an overall sonic palette for a piece of art, especially media like film, TV shows, live theater, commercials, and podcasts.
6 applications for sound design.
● Film. Film sound design mostly involves creating sounds that mimic real life. If a scene takes place in a diner in the 1970s, the sound designer might combine the sounds of a kitchen, the clinking of glassware, and some era-specific music drifting from the jukebox.
● Television. The duties of a TV sound designer are pretty close to those of a film sound designer. The one main difference with TV is that many shows have episodes that return to the same location time and again. As a result, a TV sound designer might design a core template for scenes shot in those locations so they can create consistency from one episode to the next.
● Advertising. If you listen closely, you can hear all kinds of sound effects in TV and radio commercials. Sound designers create sonic palettes that transport audiences to the various worlds where those ads take place. The goal of most commercial sound design is to blend in without being distracting. In most cases, you should barely notice the sound.
● Music. You can find examples of sound design in music recordings.
● Podcasts. As a purely sound-based storytelling medium, podcasts require careful attention to sound design. This particularly applies to narrative podcasts, where the right sonic textures can transport a listener into the world of the story.
● Live theater. Sound designers make immense contributions to live theater. Theater sound design may include sound effects, pre-recorded voiceover, and music coming from an onstage radio or television.
6 elements of sound design
● Voice-over. Voice-over is pre-recorded audio, typically provided by one of the actors in a production. It can serve as a narration or come from an off-stage (or off-screen) character.
● Ambiance. Ambiance describes the sonic tapestries created by a sound designer to create a sense of time and place.
● Foley sounds. Foley involves using physical objects to create sound effects, such as using a pair of coconut shells to mimic the sound of galloping horses.
● Audio effects. Audio effects consist of various sounds related to specific objects — like a ringing telephone, a firing gun, or a revving motorcycle. They can also describe standalone audio cues that layer on top of ambient soundscapes.
● Music. Sound design often involves music — both preexisting music licensed for a production and original music created specifically for the project.
● Live microphones. In live theater productions, a sound designer is often in charge of sound reinforcement, which involves miking actors and projecting their voices through a theater’s speakers.
MICROPHONES
A microphone, also called a mic or mike, is a transducer that converts sound into an electrical signal. Microphones are used in many applications such as telephones, hearing aids, public address systems for concert halls and public events, motion picture production, live and recorded audio engineering, sound recording, two-way radios, megaphones, and radio and television broadcasting. They are also used in computers for recording voice, speech recognition, VoIP, and for other purposes such as ultrasonic sensors or knock sensors.
Dynamic
Dynamic microphones are microphones that convert sound into an electrical signal by means of electromagnetism. They fall into two categories, moving coil and ribbon microphones.
MOVING COIL MICROPHONES
Moving coil microphones are probably easiest to understand, because they are basically built like a loudspeaker: A coil is glued to the rear of a membrane, and there is a strong magnet surrounding this coil. When sound waves hit the microphone, the membrane moves to the rhythm of the sound waves, and the coil on its back moves along with it. The relative movement of the coil within its (stationary) magnetic gap induces a small signal voltage in this coil.
Moving coil microphones are often preferred for use on stage, because they are quite sturdy and do not require external power. In the studio, engineers usually prefer condenser or in some cases ribbon microphones, which are less robust but offer superior sound reproduction.
RIBBON MICS
Ribbon microphones work by the same basic principle of electromagnetic induction. However, instead of having a membrane and a coil, a ribbon transducer uses a narrow strip of extremely thin aluminum foil. Such a thin piece of aluminum ribbon is much lighter than a membrane with a coil of copper wire attached to it. A ribbon transducer is therefore able to follow the movements of the sound waves more accurately than a moving coil capsule.
However, with just one conductor inside the magnetic gap instead of a whole coil of wire, it also produces much, much lower output.
By nature, ribbon mics are bidirectional, i.e. they are equally sensitive to sound coming from the front and sound coming from the rear. But sound waves coming from the sides do not set the ribbon in motion. This pickup pattern is called figure-8.
Ribbon mics are very fragile and must be treated with great care.
Condenser
While dynamic microphones are popular for stage use, due to their rugged construction, condenser microphones have always been the preferred type for studio recording. The British call them “capacitor microphones” – and for a reason, too. You may remember from physics class that a capacitor is essentially two metal plates in close proximity. The closer they are, the higher the capacitance.
A condenser capsule is constructed similarly. It consists of a thin membrane in close proximity to a solid metal plate. The membrane or diaphragm, as it is often called, must be electrically conductive, at least on its surface. When sound waves hit the diaphragm, it moves back and forth relative to the solid backplate. In other words, the distance between the two capacitor plates changes. As a result, the capacitance changes to the rhythm of the sound waves.
The condenser capsule’s output voltage is actually quite high, but it produces almost no current, because so little energy is stored in this small capacitor. Condenser microphones therefore require external power. Because the diaphragm carries no heavy coil, it can follow the sound waves very accurately, which is why condenser microphones offer superior sound quality. They also usually offer much higher sensitivity (i.e. output) and lower noise than dynamic microphones.
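The capacitance relationship described above follows the parallel-plate formula C = ε0·A/d. A minimal sketch (an idealized model with assumed, illustrative capsule dimensions, not a real capsule design) shows how moving the diaphragm closer to the backplate raises the capacitance:

```python
# Idealized parallel-plate model of a condenser capsule.
# C = epsilon_0 * A / d: the smaller the gap d, the higher the capacitance.

EPSILON_0 = 8.854e-12  # permittivity of free space, in farads per metre

def capacitance(area_m2, gap_m):
    """Capacitance of an ideal parallel-plate capacitor, in farads."""
    return EPSILON_0 * area_m2 / gap_m

# Assumed dimensions: ~25 mm diameter diaphragm, 25 micron gap at rest.
area = (0.0125 ** 2) * 3.14159
at_rest = capacitance(area, 25e-6)
pushed_in = capacitance(area, 24e-6)   # diaphragm moved 1 micron closer

assert pushed_in > at_rest  # smaller gap -> higher capacitance
print(f"at rest: {at_rest * 1e12:.1f} pF, pushed in: {pushed_in * 1e12:.1f} pF")
```

As the sound wave pushes the diaphragm in and out, the capacitance rises and falls in the same rhythm, which is the changing electrical quantity the microphone circuitry turns into a signal.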
Directional
Directionality is a microphone's sensitivity to sound relative to the direction or angle from which the sound arrives. There are a number of different directional patterns available, and these are plotted in graphs known as polar patterns.
The three basic directional types of microphones are omnidirectional, unidirectional, and bidirectional.
- Omnidirectional Microphones
Omni mics are equally sensitive to sound arriving from all angles. Omni mics have the advantage of sounding particularly open and natural when compared to unidirectional or bi-directional alternatives. This attribute makes them a great choice in studio environments with good acoustics, or live applications where the stage volume is low. The disadvantage is obviously the lack of directionality: if you need to reject background noise, room ambiance, or monitor feedback on stage, omnis are not a suitable choice.
- Unidirectional mics
Unidirectional mics are most sensitive to sound arriving from directly in front – the angle referred to as 0 degrees – and less sensitive in other directions.
Applications: Live sound, Studio recording (particularly in less than ideal acoustic environments)
- Bi-directional (figure-of-eight)
The coverage or pickup angle is only about 90 degrees at both the front and the rear. This mic could be used for picking up two opposing sound sources, such as a vocal duet, for example.
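The three basic pickup patterns can be sketched with the standard first-order textbook formulas (an idealized model, not any specific microphone's measured response), showing sensitivity at the front (0°), side (90°) and rear (180°):

```python
import math

# Idealized first-order polar patterns: sensitivity as a function of
# the angle of arrival theta (0 = directly on-axis, in radians).

def omni(theta):
    return 1.0                          # equally sensitive at all angles

def figure8(theta):
    return abs(math.cos(theta))         # bidirectional: rejects the sides

def cardioid(theta):
    return 0.5 + 0.5 * math.cos(theta)  # unidirectional: rejects the rear

front, side, rear = 0.0, math.pi / 2, math.pi
for name, pattern in [("omni", omni), ("figure-8", figure8), ("cardioid", cardioid)]:
    print(name, [round(pattern(a), 2) for a in (front, side, rear)])
# omni is equal everywhere; figure-8 is zero at the sides;
# cardioid is full at the front and zero at the rear
```

Plotting these functions on a circular graph produces exactly the polar pattern diagrams mentioned above.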
BASICS OF SOUND CONCEPT
INTRODUCTION:
A sound is a vibration that propagates through a medium in the form of a mechanical wave.
Only acoustic waves that have frequencies lying between about 20 Hz and 20 kHz, the audio frequency range, elicit an auditory percept in humans.
Sound in film and television is meant to support the story of a narrative, documentary, or commercial film or television program. Sound may tell the story directly, or it may be used indirectly to enhance the story effects of the movie.
Although there are separate perceptual mechanisms for sound and picture, the sound may be integrated by the audience along with the picture into a complete whole, without differentiation. In such a state, the sound and picture together can become greater than the sum of the parts. In most instances, film and television sound is constructed in post production utilising many pieces of sound, mixed seamlessly together to create a complete whole.
PROPERTIES OF SOUND:
1.Acoustics:
Acoustics is a term referring to the qualities that determine a room’s ability to reflect sound waves in such a way as to produce distinct hearing.
Acoustic properties of sound:
Loudness - It is the volume or level of the sound. Intensity is measured in decibels (dB).
Pitch - It is the perceived "highness" or "lowness" of the sound. Pitch is the characteristic of sound by which an acute (or shrill) note can be distinguished from a grave or a flat note.
Timbre - is the quality of a musical note or sound or tone that distinguishes different types of sound production, such as voices or musical instruments.
Rhythm - is the pulse or beat of a sound. Time is a factor here, as rhythm can be fast or slow. The speed of a rhythm is its TEMPO.
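Two of these properties can be made concrete in a short sketch: loudness in decibels is computed from the ratio of a sound pressure to a reference pressure, and pitch corresponds to the frequency of the vibration (the pressure values below are illustrative, not measurements):

```python
import math

# Loudness: sound pressure level in decibels, relative to a reference
# pressure of 20 micropascals (the nominal threshold of human hearing).
P_REF = 20e-6  # pascals

def spl_db(pressure_pa):
    return 20 * math.log10(pressure_pa / P_REF)

# Doubling the sound pressure raises the level by about 6 dB:
quiet = spl_db(0.002)
louder = spl_db(0.004)
print(f"{quiet:.1f} dB -> {louder:.1f} dB (+{louder - quiet:.1f} dB)")

# Pitch: perceived highness or lowness follows frequency. A 440 Hz
# sine wave (concert A) evaluated at time t seconds:
def sine_sample(freq_hz, t_s, amplitude=1.0):
    return amplitude * math.sin(2 * math.pi * freq_hz * t_s)
```

Timbre and rhythm are harder to reduce to a single formula: timbre depends on the mix of overtones in the wave, and rhythm on how sound events are spaced in time.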
2. Recording:
Sound recording and reproduction is the electrical, mechanical, electronic, or digital inscription and re-creation of sound waves, such as spoken voice, singing, instrumental music, or sound effects. It is the transcription of invisible vibrations in air onto a storage medium such as a phonograph disc. The two main classes of sound recording technology are analog recording and digital recording.
ANALOGUE RECORDING
The analog recorder uses a microphone, which converts sound waves into an electrical analog signal. The analog signal can be recorded onto cassette tapes and LP vinyl records.
The digital recorder also uses a microphone to capture the analog signal, and converts this acquired analog signal into digital form (i.e. a series of numbers). It can be recorded onto CD (Compact Disc), HD (Hard Drive), SSD, memory (RAM/ROM) or optical drives, streamed live on the internet, or stored on computers, servers or mobile phones.
Microphones – are a type of transducer - a device that converts energy from one form to another.
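The "series of numbers" produced by a digital recorder comes from two steps: sampling the signal at regular intervals and quantizing each sample to a whole number. A minimal sketch (the sample rate and bit depth below are assumed for illustration, roughly telephone quality):

```python
import math

# Sample an "analog" signal function at regular intervals and quantize
# each sample to an 8-bit value (0..255) - the series of numbers a
# digital recorder stores.

SAMPLE_RATE = 8000   # samples per second (assumed)
BIT_DEPTH = 8        # bits per sample (assumed)

def record(signal, duration_s):
    levels = 2 ** BIT_DEPTH
    samples = []
    for i in range(int(SAMPLE_RATE * duration_s)):
        t = i / SAMPLE_RATE
        x = signal(t)                               # analog value in -1.0..1.0
        q = round((x + 1.0) / 2.0 * (levels - 1))   # quantize to 0..255
        samples.append(q)
    return samples

tone = lambda t: math.sin(2 * math.pi * 440 * t)    # 440 Hz test tone
pcm = record(tone, 0.01)                            # 10 ms -> 80 numbers
print(len(pcm), min(pcm), max(pcm))
```

Real recorders use much higher rates (e.g. 44,100 samples per second at 16 bits for CD audio), but the principle is the same.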
3. MIXING:
Audio mixing is the process by which multiple sounds are combined into one or more channels. In the process, a source's volume level, frequency content, dynamics, and panoramic position are manipulated or enhanced. This practical, aesthetic, or otherwise creative treatment is done in order to produce a finished version that is appealing to listeners. Audio mixing is practised for music, film, television and live sound. The process is generally carried out by a mixing engineer operating a mixing console or digital audio workstation.
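The core of that process, summing sources into channels with a volume level and a panoramic (pan) position for each, can be sketched as follows (the function and a simple linear pan law are assumptions for illustration, not any real console's behaviour):

```python
# Mix several mono tracks into one stereo pair. Each track has a gain
# (volume level) and a pan position from -1.0 (hard left) to +1.0 (hard right).

def mix(tracks):
    """tracks: list of (samples, gain, pan) tuples. Returns (left, right)."""
    length = max(len(samples) for samples, _, _ in tracks)
    left = [0.0] * length
    right = [0.0] * length
    for samples, gain, pan in tracks:
        l_weight = gain * (1.0 - pan) / 2.0   # simple linear pan law
        r_weight = gain * (1.0 + pan) / 2.0
        for i, x in enumerate(samples):
            left[i] += x * l_weight
            right[i] += x * r_weight
    return left, right

voice = [0.5, 0.5, 0.5, 0.5]      # toy sample values
music = [0.2, -0.2, 0.2, -0.2]
# Voice centred at full level; music at half level, panned hard left.
L, R = mix([(voice, 1.0, 0.0), (music, 0.5, -1.0)])
```

A real mixing console or digital audio workstation adds equalization, dynamics processing and effects on top, but the summing of weighted channels is the same idea.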
4. SOUND EFFECTS:
A sound effect (or audio effect) is an artificially created or enhanced sound, or sound process used to emphasise artistic or other content of films, television shows, live performance, animation, video games, music, or other media. Traditionally, in the twentieth century, they were created with foley.
Foley – is the reproduction of everyday sounds for use in filmmaking. These reproduced sounds can be anything from the swishing of clothing and footsteps to squeaky doors and breaking glass. The best Foley art is so well integrated into a film that it goes unnoticed by the audience. It helps to create a sense of reality within a scene.
Decay – Decay is the fade out of the reverberation of a sound. In other words, the decay refers to how the sound fades away in terms of a quick decay, or gradual.
Walla – A sound effect for the murmur of a crowd in the background. Walla is often used as subliminal aural communication and sets a mood or a tone.
SOUND BITE:
A sound bite is a short clip of speech or music extracted from a longer piece of audio, often used to promote or exemplify the full-length piece.
Due to its brevity, the sound bite often overshadows the broader context in which it was spoken, and can be misleading or inaccurate. The insertion of sound bites into news broadcasts or documentaries is open to manipulation, leading to conflict over journalistic ethics.
Categories of sound in Film:
Dialogue – Cinematic dialogue is oral speech between characters
Voice-over – (also known as off-camera or off-stage commentary)
The voice-over may be spoken by someone who appears elsewhere in the production or by a specialist voice actor. Voice-overs are often used to create the effect of storytelling by a character/omniscient narrator.
Music – is an art form whose medium is sound.
What is a Visual?
Visual journalism is the combination of text, images and layout. Visual journalism is a much wider concept than photojournalism. It is essentially creating the overall visual appearance of a newspaper, magazine, TV-channel or so on.
Visual includes:
● editorial cartoons,
● infographics- charts, graphs, maps, and diagrams.
● news design- News design is the process of arranging editorial material on a newspaper page, and the resulting arrangement includes layout, makeup, and pagination.
● photojournalism
Visual journalism can effectively help our audience to understand a story better. They say a picture can be worth a thousand words and that's certainly true. A simple map or graphic can really convey a story in a visual way that can be immediately grasped.
Visual culture is concerned with the production, circulation and consumption of images and the changing nature of subjectivity.
VISUAL GRAMMAR
Elements
● Objects – The basic elements we have to work with. Can be abstract or concrete.
● Structures – The patterns formed from our basic elements. Can be abstract or concrete.
● Activities – The processes we can represent with our basic elements and patterns.
● Relations – The relationships between objects, patterns, and processes. They’re the way everything in your design relates to each other and the viewer.
Various types of Visual Grammar
1. Points – A point is a position on a coordinate system. For example, connecting two points gives a line, and connecting three points gives a triangle.
2. Volumes – An empty space defined by surfaces, lines, and points. Volumes have three dimensions.
3. Size – Forms can be large or small. They are perceived relative to the person viewing, other forms in the composition, and the format of the design.
4. Color – we perceive different wavelengths of light as color. A form can be any color, though we are limited to seeing only those colors in the visual spectrum. Imagine traffic lights, for instance. They’re just colours, but we learn that red means stop, green means go and yellow means hurry because you can make it before it turns red. Every channel has its own colour.
5. Typography- Typography is how your words look.
6. Balance, Rhythm and Contrast- When the visual weights of objects in a composition are in equilibrium, the composition is in balance. For eg- TV anchors are dressed in an appropriate manner relating to their role in the society. Every news programme, soap opera has a musical connotation.
7. Space- Space is the empty areas between major and minor objects and is as important to a design as the areas where the space is filled with objects. Learning to see and use the space in your designs is one of the most important things any designer can learn. You need to consider how each element or letter relates to the others and give them the breathing room they need; this empty room is usually referred to as negative space (positive space is the actual letters).
8. Weight- All objects in a design have a visual weight based on size, color, form, surrounding space, etc. Our eye is drawn to elements with the greatest visual weight.
9. Background/ Foreground- The position of objects, their relative weights, the space around them and, their relative proportions all contribute to which elements are seen as being in the foreground and which are seen as part of the background.
10. Grid and alignment- Grids are mostly hidden, but you'll see them if you open a book or a newspaper. No matter what you’re designing, following a grid will structure your design and make it more pleasant and easier to digest. Alignment is especially important with text; there are several ways to align it, but a good rule of thumb is to align it left. It always depends on what and for whom you’re designing, of course, but generally people read from left to right, top to bottom, which makes text that is centre- or right-aligned much more difficult to read.
11. Frame- Try to direct the eye to what matters, crop/frame images to make your subject stand out or to reinforce your message. It’s all about telling the right story and telling it well.
12. Visual concept- This is the idea behind your design: what you mean by it, and the deeper meaning behind the superficial image. This is what distinguishes a great design from something you can download from a stock website. People say don’t judge a book by its cover, but that is exactly what is always done.
What is visual culture?
Very broadly, visual culture is everything that is seen, that is produced to be seen, and the way in which it is seen and understood. It is that part of culture that communicates through visual means. According to Nicholas Mirzoeff, "visual culture is not just a part of our everyday life, it is our everyday life".
Visual culture is perhaps best understood as a tactic for studying the functions of a world addressed through pictures, images and visualisation, rather than through texts and words.
Visual culture studies
Visual culture is a growing interdisciplinary field of study, which emerged out of the interaction of anthropology, art history, media studies and many other disciplines that focus on visual objects or the way pictures and images are created and used within a society.
Importance of Visual Culture
Our experiences are more visual and visualized than ever before. In the era of the visual screen, our viewpoint is crucial. For most people, life is mediated through television, film and the internet.
The visual culture approach acknowledges the reality of living in a world of cross mediation.
Images often move across social arenas, from documentary images to advertisements to amateur videos to news images to artworks. Each change in context produces a change in meaning. A single image can serve multitudes of purposes, appear in a range of settings, and mean different things to different people.
Decoding an Image
We decode, or read, complex images almost instantly, giving little thought to our process of decoding. We decode images by interpreting clues to intended, unintended, and even suggested meanings. These clues may be formal elements of the image, such as colour, shade and contrast, or the socio-historical context in which it is presented.
All images have two levels of meaning:
The denotative meaning of the image refers to its literal, descriptive meaning.
The connotative meanings rely on cultural and historical context of the image and its viewers.
What all is included in visual culture?
Painting
Sculpture
Installation
Video art
Digital art
Photography
Film
Television
The internet
Mobile screenic devices
Fashion
Architecture and urban design: Social spaces of museums, galleries, exhibitions, and the other private and public environments of the everyday.
Representation: Representation refers to the use of images to create meaning about the world around us. These systems have rules and conventions about how to express and interpret meaning.
Visual literacy: Visual literacy has no limits. It is not just the understanding of canonical fine art, or the business of advertising, but the entire visual world. Visual culture studies provide you with the ability to analyse the visual world.
Conclusion
Developments such as the internet and high-definition TV make clear that visual culture is deeply rooted in everyday life and that visualizing is here to stay. Analyzing visual culture is a useful tool for understanding more about the world in which we live. One may see things in a different aspect, or an image might take on new meaning, once its meaning is analyzed and truly understood.
POLITICS OF AN IMAGE
Introduction
Human culture is a visual culture. From cave paintings to selfies, we have always used images to tell stories about our lives, experiences and understanding of the world.
These images are particularly potent when they not only depict, but instruct us about social norms – when they shape attitudes and behaviour on everything from the role of women to ideas about nationhood.
But the idea that a picture “never lies” is a powerful – and inaccurate – adage. For they do not always tell the whole story. And the fact that images may be strategically constructed, manipulated or chosen carefully to convey an impression, can often go unnoticed by the people looking at them.
The beauty of a photograph is that it reinvents and periodizes the event; it can carry the same intensity as the real-time event, triggering the same responses and reactions as the event itself.
Reasons for the Development
The reasons for the development of politics in an image are obvious: footage of the World Trade Centre attacks and photos from Abu Ghraib and Guantanamo (to give just a few examples) have clearly demonstrated that images not only respond to political events, but also play an important part in shaping them. At times, however, the visual of a rape victim, on electronic media, the internet or in print, is likely to raise the hackles of the common man or woman sitting in front of the television set watching the news. When images are used to convey a given message to a wider public, it is perhaps logical. But does it protect the privacy and dignity of the person whose image we are choosing to project without that person's permission, living, injured or dead? This is what the politics of image is all about: the perspective that is attached to the image by the media.
Politicians and Image
Politicians recognise the fact that a picture in a newspaper is worth many speeches. Images make a person look more powerful. A news article without an image is often not read. Elections are more or less a numbers game, and in this the politics of image plays an important role.
Images are enough to project a said politician as hard or soft, firm or rude, warm or cold. President
Donald Trump constantly uses images on social media to create an impression of a certain kind of success, power, and leadership. He also appears to be ever mindful of the image he portrays, putting great effort into his stance and gestures, and displays of presidential power.
One shot is enough to create an image in the minds of the viewer. Before elections, politicians often hire people to do their image building.
Criticism
However, visuals sometimes convey reality incorrectly. They can at times mould the actual reality to project a rather narrow version of an image. For example, a video of the Finnish PM went viral showing her dancing, laughing and having fun. Many people perceived it as showing that she usually engages in such activities instead of pursuing her duties, but it could just be that she had completed all her work beforehand and was enjoying some free time.
Sometimes slips of the tongue or certain incidents are repeatedly projected by the media.
For example, in the recent case of a Noida woman abusing her security guard, the slap was shown repeatedly by the media to grab the attention of viewers and to project the vicious nature of the woman, even though she had slapped him only once.
The play of TRPs and sensational news affects the politics of image in the media.
Factors affecting politics of image:
• The number of parties in the country for election related news
• Money power
• Ownership pattern of media
• Media Literacy
• The presence of media regulatory bodies, which can have both pros and cons
• Editorial cartoons and caricatures
• Sometimes edited and photoshopped images are presented on social media
ECOLOGY OF AN IMAGE
Introduction
● 'An ecology of images' is an approach to making and reproducing photographs that would protect both the meaning of particular pictures and the integrity of the reality they depict.
● Images are playing an increasingly important role in visualizing ecologies in our complex, designed world. Yet what of the ecological and ontological character of images themselves?
● The image is, beyond representation, a dynamic form of design that takes hold of eye and mind. It is as capable of violence and ecological destruction as any material form.
● Ecology of the image designates a critical and reflexive engagement with the environments we have designed for ourselves - including the environment of signs - but it also points to the crucial role of meaning making and image creation in the broad project of Sustainability.
● The ideas behind an ‘ecology of images’ can be summarized as follows (originally shown as a diagram, with a honeycomb of contexts around a central image): an image always exists in a set of contexts. It is always part of an ‘image community’, which it works with or against.
● An image community can be thought of as a genre and/or a modality of images.
● As such, there are formal aesthetic properties and particular content and uses an image might share with other images, or indeed with which it is attempting to work against or appropriate.
Unit II: Visual Grammar
• Basics of a Camera- (Lens & accessories)
• Camera Movement
• Types of shots, Focusing, Depth of field, Lighting
• Visual Perspective
A camera is a remote sensing device that can capture and store or transmit images. Light is collected and focused through an optical system on a sensitive surface (sensor) that converts intensity and frequency of the electromagnetic radiation to information, through chemical or electronic processes.
The image sensor is made up of millions of light sensitive photodiodes set on a grid, where each photodiode records a tiny portion of the image as a numeric value that corresponds to a specific brightness level, which is then used to create your image.
Basic Components of a Camera:
1. Lens: The lens is one of the most vital parts of a camera. The light enters through the lens, and this is where the photo process begins. Lenses can be either fixed permanently to the body or interchangeable. They can also vary in focal length, aperture, and other details.
2. Viewfinder: The viewfinder can be found on all DSLRs and some models of digital compacts. On DSLRs, it will be the main visual source for image-taking, but many of today’s digital compacts have replaced the typical viewfinder with an LCD screen.
Most digital cameras have two viewing systems – the optical viewfinder and the electronic viewfinder.
While both systems show you what the lens sees, the electronic viewfinder can tell you other things about the nature of your digital image. One area where the electronic viewfinder is superior is in determining colour balance.
3. Body: The body is the main portion of the camera, and bodies can be a number of different shapes and sizes. DSLRs tend to be larger bodied and a bit heavier, while there are other consumer cameras that are a conveniently smaller size and even able to fit into a pocket.
4. Shutter Release: The shutter release button is the mechanism that “releases” the shutter, allowing the image to be captured. The length of time the shutter is left open or “exposed” is determined by the shutter speed.
5. Aperture: The aperture affects the image’s exposure by changing the diameter of the lens opening, which controls the amount of light reaching the image sensor.
6. Image Sensor: The image sensor converts the optical image to an electronic signal, which is then sent to your memory card. There are two main types of image sensors that are used in most digital cameras: CMOS (Complementary Metal Oxide Semiconductor) and CCD (Charge-coupled Device). Both forms of the sensor accomplish the same task, but each has a different method of performance.
7. Memory Card: The memory card stores all of the image information, and they range in size and speed capacity. The main types of memory cards available are CF and SD cards, and cameras vary on which type that they require.
8. Flash: The on-board flash will be available on all cameras except some professional grade DSLRs. It can sometimes be useful to provide a bit of extra light during dim, low light situations. The camera’s computer determines the need for flash according to the exposure metering, focusing, and zoom systems. On compact cameras, the built-in flash is triggered to go off in perfect sync with the shutter, but it’s hard to control the timing and intensity of the flash. This can result in washed-out photos.
9. User Controls: The controls on each camera will vary depending on the model and type. Your basic digital compacts may only have auto settings that can be used for different environments, while a DSLR will have numerous controls for auto and manual shooting along with custom settings.
Camera movement is a filmmaking technique that causes a change in frame or perspective through the movement of the camera. Camera movement allows cinematographers and directors to shift the audience's view without cutting. Specific types of camera movements in film also can create a psychological and emotional effect on the audience.
Static shot: A static shot has no camera movement at all. It is achieved by locking the camera to a fixed position, typically with a tripod, and is often used to showcase an actor's performance.
Pan shot: The camera pan directs a camera horizontally left or right, used to reveal new information to the audience.
When done quickly, this fast camera movement is known as a whip pan. Whip pans are one of the best camera movements for adding energy to a shot.
Tilt shot: Tilt camera movements direct a camera upward or downward. In Jurassic Park, Steven Spielberg uses the tilt when first introducing the dinosaurs. The camera tilt perfectly captures the emotions of the film's characters while eliciting awe in the audience.
Roll: The camera roll is a rotational camera movement that rotates the camera over its side on its long axis. Rolls can be dizzying and unnatural. For this reason, filmmakers use it to disorient the audience or create uneasiness.
Tracking shot: A tracking shot is any shot that physically moves the camera through the scene for an extended amount of time.
Tilt and Pan shot: When the camera both tilts (up and down) and pans (left and right) within the same movement.
Low and High angle shot: A shot in which the camera moves from the character's legs up to the head, shifting between low and high angles.
TYPES OF SHOTS
A series of shots comes together to form a scene, and a series of scenes comes together to form a sequence. Every shot must add to the viewer's information.
Factors affecting shots
● Camera Angle
● Camera Movement
● Camera Distance
● Camera Height
1. Extreme Long Shot- The extreme wide shot or extreme long shot is all about showing the world in which the story takes place. In an extreme wide you will see large landscapes in the frame, whether it is a desert or outer space.
2. Medium Shot- shows your character from the waist up. Medium shots are often used in dialog scenes. As we get closer to our subjects, we can see things that we wouldn’t catch in a wide, like body language. We can see crossed arms or someone who talks with their hands.
3. Close up Shot- frames the character's face. In a close-up shot one can see even more detail that tells us how a character feels. A close-up highlights emotional clues in the eyes, and you can see a twitch or a tear that you might miss in a medium shot. It is similar to a passport-size photograph.
4. Medium close up shot- Halfway between the close-up and the medium shot is the medium close-up that frames the subject from the shoulders up. This shot might be used if you want to show more body language as you capture some emotion and facial expressions.
5. Extreme close up shot- frames even tighter on a face (or subject), highlighting facial features more. It usually frames a particular part of the face like the eyes or the mouth. It is often used as more drama increases.
6. Two Shot:- A shot that frames two characters in the scene is known as a two shot. With three characters it is a three shot, and with four characters it is a four shot.
7. OVER THE SHOULDER SHOT:- When two characters are talking to each other, the camera is placed behind one character so that the other appears in the frame from head to shoulder.
8. MOVING SHOT:- A shot in which the camera follows a subject or object as it moves from one place to another.
9. REVERSE SHOT:- Also known as a re-establishment shot: a shot taken from the opposite side of the preceding shot, so that all the characters in the scene remain established without cutting anyone out.
10. HOOK:- A hook is a suspenseful scene that creates anticipation among the viewers.
11. FADE IN/OUT:- When a scene gradually appears from black, it is a fade in; when it gradually vanishes to black, it is a fade out.
12. SPLIT SCREEN- When the screen is divided into two parts showing two different scenes, this is known as a split screen.
13. CONTINUITY:- When many shots are filmed and then edited so that they flow logically, this is known as continuity in the shots.
14. TRACKING SHOT:- A shot in which the camera follows the path of a character is known as a tracking shot.
DEPTH OF FIELD
WHAT IS THE DEPTH OF FIELD IN PHOTOGRAPHY?
● In photography, the depth of field (DoF) is the portion of the image that is crisp and in focus. It is the distance between the closest and farthest objects in your image that appear acceptably sharp.
● It is important to remember that depth of field refers to the amount of a picture that is acceptable for focus. Consider taking pictures of two subjects that are at two different distances from you. Depth of field is defined as the distance between two subjects when just that portion of the scene between them is crisp and everything else is blurry.
● Before we go into that, let's become familiar with the most prevalent depth of field terminology. Understanding depth of field is crucial for getting all the significant features you want in your image in focus.
WHAT IS SHALLOW DEPTH OF FIELD?
● The line between the closest and farthest parts that are crisp and in focus is quite short when there is a shallow depth of field; in other words, just a small section of your image is in focus.
● The short depth of field and narrow depth of field are other names for shallow depth of field. Understanding what a shallow depth of field is and, more importantly, when you should utilize one, is crucial.
● The factors listed below influence a shallow depth of field, with the aperture playing the most significant role (the larger the aperture, the shallower the depth of field).
WHAT IS A LARGE DEPTH OF FIELD?
● A larger depth of field, on the other hand, results in a bigger portion of the image being in focus. In this instance, there is a considerable distance between the closest and farthest components that are relatively sharp.
● Deep depth of field or huge depth of field are other terms for a large depth of field, which can be created by a number of factors, with the focal length being one of the most crucial. For instance, wider-angle lenses enable a deeper depth of field.
● Depending on the circumstances and the subject you are photographing, choose your depth of field.
● To choose the optimal depth of field for your photography, it is first important to understand the key elements that influence depth of focus so that you may subsequently modify the DOF in your photographs in accordance with your preferences and objectives.
DEPTH OF FIELD FACTORS IN PHOTOGRAPHY
Here are four main factors that affect depth of field in photography:
1. DEPTH OF FIELD AND APERTURE
● The relationship between aperture and depth of field is the first consideration.
● The depth of field in an image is correlated with the aperture in photography. Simply explained, the depth of field will be shallower if you choose a larger aperture. On the other hand, the narrower the aperture you pick, the bigger the depth of field in your image.
● There are other factors that influence DOF, but as a depth of field photography tip, change your aperture first in accordance with the depth of field and the elements in your image that you want in focus before considering other exposure triangle adjustments.
2. FOCAL LENGTH AND DOF
● Your focal length and depth of field are the second determining factors.
● In this situation, the shorter the focal length, the bigger the depth of field, whereas the longer the focal length, the shallower the depth of field. According to this reasoning, a 50 mm lens' depth of field will be shallower than a 35 mm lens' DOF.
● This is due to a physical cause. When using longer focal lengths, such as telephoto lenses, there is a relationship between magnification and depth of field; the elements in the far distance will be larger and, as a result, it will be harder to focus on a large area. When using a wide-angle lens, however, all the elements in the far distance are smaller, making it easier to focus on everything.
● The scene you wish to photograph will determine which lens has the best depth of field. For instance, telephoto and quick lenses with a wide aperture are frequently the best lenses for a shallow depth of field.
● Wide-angle lenses, on the other hand, are the greatest for having a deep field since you can focus at the hyperfocal distance and have nearly the entire image in a respectable focus.
3. FOCUSING ON DISTANCE AND DOF
● Distance and depth of field are likewise two components that cannot be separated. The closer you are to the subject you wish to photograph, the shallower the depth of field; the farther away you are from the main subject, the bigger the depth of field.
● A helpful DOF tip is to check the depth of field before mounting the tripod, so you can relocate if you are too close to the foreground and need a greater depth of field. If you position yourself in accordance with the depth of field and the focusing distance from your primary subject, you'll get the desired outcome.
4. SENSOR SIZE AND DOF
● The link between the camera sensor size and depth of field is the final consideration.
● Given that we are utilizing the same focal length, the depth of field will increase with increasing sensor size while decreasing sensor size will result in a reduced depth of field.
● Please take note that we are comparing the depth of field with the same focal length. The depth of field will be smaller in cameras with larger sensors and greater in cameras with cropped sensors if we take into account the same field of view.
● Since the sensor size is cropped in APS-C cameras, for instance, normal focal lengths are multiplied by 1.5. A 35 mm lens on an APS-C camera gives a field of view roughly equivalent to a 52 mm lens on a full-frame camera, so, as discussed in the focal length vs. DOF section above, the depth of field will be less in the cropped-sensor camera.
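The four factors above (aperture, focal length, focusing distance, and sensor size via the circle of confusion) combine in the standard hyperfocal-distance approximation. A rough sketch in Python, with illustrative values, not any camera maker's exact formula:

```python
def depth_of_field(focal_mm, f_number, distance_mm, coc_mm=0.03):
    """Return (near, far) limits of acceptable sharpness, in mm.

    coc_mm is the circle of confusion: roughly 0.03 mm for a full-frame
    sensor and about 0.02 mm for APS-C, which is how sensor size enters
    the calculation.
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * distance_mm / (hyperfocal + (distance_mm - focal_mm))
    if distance_mm >= hyperfocal:
        far = float("inf")  # focused at/beyond hyperfocal: sharp to infinity
    else:
        far = hyperfocal * distance_mm / (hyperfocal - (distance_mm - focal_mm))
    return near, far

# Wider aperture (smaller f-number) gives a shallower depth of field:
n1, f1 = depth_of_field(50, 1.8, 3000)  # 50 mm lens at f/1.8, subject at 3 m
n2, f2 = depth_of_field(50, 8.0, 3000)  # same lens stopped down to f/8
assert (f1 - n1) < (f2 - n2)
```

The same function also illustrates the distance factor: increasing `distance_mm` widens the gap between the near and far limits.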
FOCUS EFFECTS:
● Deep Focus: This approach keeps all elements in the frame in focus. Used extensively by realists. In this approach, the foreground, middle ground, and background are all in relatively sharp focus. (The viewer almost has a choice of which subject to focus on.)
● Shallow Focus: In this approach only one plane of the image is in sharp focus while the rest is soft. This manipulates the viewer's attention. Shallow focus often uses the play between soft focus and sharp focus to achieve this. (more directed)
● Follow Focus: In which the focus is changed to permit the camera to keep a moving subject in focus. (the focus remains on the subject)
● Shifting Focus/Rack Focus: In which the focus is changed to direct attention away from one subject and toward another in a different ground. (shifting the gaze from one subject to another, the presence of the camera more evident).
LIGHTING
What Does Light Mean in Photography
When it comes to photography, the type of lighting that you use is one of the most important elements of any photo. Light in photography refers to how the light source, which can be natural or artificial, is positioned in relation to your subject.
How lighting affects your photography?
Whether you're doing portrait photography, still life or landscape, much of your lighting choice will depend on the features of your subject and how you want them to be portrayed in your photos.
If you’re doing a beauty shoot where the focus is flawless features, the type of lighting that you use will likely be very different than a photoshoot where you want to emphasize the personality and distinctive lines of your model’s face.
Different Types of Lighting
There are two main kinds of light: natural and artificial. Natural light is anything that occurs without human intervention, so it can be the direct light of the sun on a bright day, the diffused light created by a cloudy or foggy day, or even the light of the moon at night. On the other side of things, artificial light can often be moved around and adjusted to fit your situation.
Natural Light
If you want to use natural light in your photography, it's important to understand the angle of the sun and how that will affect your composition. For example, for most of the day, the sun is directly overhead, so your subject will be lit from above. A sunny day without clouds will result in more intense shadows, while a sky full of clouds will diffuse the sunlight so that the contrast of light on your subject is less harsh.
Front Light (or Flat Light)
Front light occurs when the light source is directly in front of your subject. The light will be spread evenly across the photo, with no section more or less exposed than the rest.
Flat light can be good for portraits, especially if your subject has wrinkles or blemishes that they want to de-emphasize.
Backlight
Backlit photos are when the light source is behind the subject, with the subject in between the light and your camera. This can be a great opportunity to play with silhouette and long shadows in your photography.
Soft Light
Soft lighting occurs when your light source is diffused, so that the effect is more subtle than it would be with a direct source of light. By using soft light, you will end up with less intense shadows, if any at all, and a lower contrast between the darks and lights in your photo.
Hard Light
The opposite of soft or diffused light, hard lighting is when your light source is pointed directly at your subject. It results in high contrast and intensity, bright whites and dark shadow, and is often created by making use of the midday sun.
Rim Light
Rim light can be created using a form of backlighting, where the light is at an angle from behind or above. The light will hit your subject in a way that creates a glowing outline or highlight around the subject, depending on the direction that your light is coming from. This technique is useful for distinguishing the subject from the background by providing definition.
Loop Lighting
Loop lighting is a specific technique used for portraits. The name refers to a “loop” of shadow from the nose on the cheek.
Broad Lighting
Often used for graduation photos, broad light for photography is a type of side lighting where the side of the model closest to the camera is lit, and the side farther away is in shadow.
Short Lighting
Short lighting is pretty much the exact opposite of broad lighting. In this case, the side of the face that is closest to the camera is in shadow, whereas the farthest side is in the light.
Split Lighting
When the light hits your subject at a 90-degree angle, that is called split lighting. This results in a straight line down the center of your subject’s face, with one side entirely lit and the other side completely in shadow. This is a great option for a dramatic portrait.
VISUAL PERSPECTIVE
INTRODUCTION
Visual perception is the ability to see and interpret one's visual environment. It is the brain's ability to make sense of what the eyes see. The visual perception definition does not only include seeing; it also includes organising and interpreting visual information.
TV AUDIENCE
Television audiences have a short attention span. Place many objects in a room and include a TV screen, and the eye will naturally be drawn to the TV first. Its glowing phosphors attract human attention more than anything else. Once the “data” from the screen is processed, other objects in the room gain attention.
How, then, to keep the imagination interested? The imagination constantly scans visual information for clues to meaning. It looks from left to right and top to bottom, reading the picture in the same pattern that books are read. Once it concludes that it has read everything, it stops generating pleasure. The audience member either gets sleepy or takes action to find other visual information.
Stimulating the imagination has been an important part of theatre for the past century. When an actor prepares a role, he works to create an imaginary world using his imagination to create a visual picture. He recalls as many memories as possible from all five senses to stimulate a picture in his mind. For TV, pictures and sound when used to suggest the other three senses, touch, taste and smell, keep the imagination interested. This explains why cooking shows, although visually simple, are so popular.
SOUND IN TV
Time for the imagination is expressed in music. Time is rhythm and tempo. In a live concert, music alone is enough to stimulate the imagination. But on TV, sound is secondary. However, rhythm and tempo have an equivalent visual expression.
Russian filmmakers from the early days of silent filmmaking understood the importance of speed of changing shots to keep the eye interested. This was the beginning of montage editing. Some shots were longer in time and others quick paced just like a musical score.
This is the practice in talk shows and news interviews in all of modern TV. However it is not enough.
For example, there is a background image and then one or more images inside windows and then foreground text to be read. The speed at which these elements hide and appear creates a visual musicality. Sometimes, there are three elements, sometimes four and five and then one, and so on. This is known as layering.
Visual elements have priority over sound.
To test if a TV program has visual priority, turn off the sound. Can the show be understood? Text as a layer on the screen should guide the viewer through the story.
Sports shows are popular because they can be understood without the sound. The score and time are visually displayed. The same announcer describing a sports event uses more emotion on TV than on radio, where he must paint a visual picture. The cheering crowds give the imagination audio clues as to how to interpret the pictures the viewer is seeing.
EMOTIONAL VISUALS
Strong emotional visuals win over the portrayal of fact. In American TV journalism history, the defining moment was Walter Cronkite's reporting in 1963 on the death of US President Kennedy. He choked up live and cried on camera. His emotion broke the stereotype of the unemotional host. The information he provided was forgotten but his feelings were not. This shows that the past solution of highly charged negative emotions is NOT the only solution for TV. The orthodox emotions of love, heroism, confession and forgiveness are strong positive emotions lacking in the media space. Honesty and confession of bias by our journalist on air will break through the natural suspicion of the viewer.
CONCLUSION
Every kind of news focuses on different visuals, be it horror, sports, discovery programmes, etc. The Covid-19 TV broadcasts had both factual and emotional appeal, as did coverage of the Bhopal Gas Tragedy and the reporting of 9/11.
UNIT III: Elements in Broadcast news
• Electronic News Gathering (ENG) & Electronic field Production (EFP) (Concept)
• Elements of a Television News Story: Gathering, Writing/Reporting.
• Elements of a Television News Bulletins
• Basics of Editing for TV- Basic Softwares and Techniques (for editing a news capsule)
EFP
Electronic field production, or “EFP” is a television-industry term for a video production that takes place “in the field” — in other words, outside of a formal television studio.
Examples of EFP are everything from truly outside, field events such as nature documentaries or reporting on spontaneous riots or car chases, to sporting events, awards shows, concerts, interviews that take place outside of a formal production studio.
EFP crews range from a single camera operator or crew of two (camera operator with sound mixer) capturing high-quality imagery, to a multiple-camera setup utilizing videography, photography, and advanced graphics and sound and often an entire mobile production truck. It also includes mobile journalism, and sometimes uses OB vans.
Sports broadcasts make up the majority of EFPs.
It makes use of eight basic elements of production-
● The camera
● Lighting
● Audio
● Switching
● Videotape recording
● Tapeless system
● Postproduction editing
● Special effects
ENG
Electronic news-gathering (ENG) is when reporters and editors make use of electronic video and audio technologies in order to gather and present news. It is the process of reporting events and activities that occur outside the television studio. In modern news operations, however, it also includes SNG (Satellite News Gathering) and DSNG (Digital Satellite News Gathering).
ADVANTAGES OF ENG
i) Speed: The technology grants speed to news reporting, as the videotapes and digital discs from the ENG machines are available and ready for editing immediately after they are recorded. The use of ENG has made television news reporting more lively, faster, and flexible. The print medium may wait to tell its readers what has happened in the next edition, but the television medium shows and tells the viewers exactly what is happening now, with vivid pictures. Indeed, ENG brings the people to the news events.
ii) Editing flexibility: The technology allows for the quick construction of a basic news story. Audio and visual effects can be added for emphasis. This means that the reporter can edit his stories right there at the locations even before reaching the studio.
iii) Mobility: The use of helicopters, microwave and satellite news gathering equipment has made it possible to reach further and faster to any part of the world to cover present happenings. This adds depth and breadth to news coverage and makes it possible to go live from the scene of a story.
TEAM WORK
• A successful ENG team produces a video segment that enables viewers to experience the event as if they were there.
• The reporter and Cameraperson must be prepared for every situation that might present itself during the reporting of an event.
• Use library and internet resources to obtain background information on a larger scale.
• Make phone calls and hold face-to-face informal meetings to obtain background information.
EQUIPMENT
• The camera is now the most powerful weapon in the world.
• Always check the camcorder and recording equipment before leaving the studio. Perform an Audio & Video check before leaving the studio.
• Bring microphones for various purposes.
• Power supply: charged batteries, charger, AC power supply, extra microphone batteries.
• Storage Devices, tripod, headphones
• Portable lighting Devices
• Pen or pencil and paper for note taking.
DIFFERENCE BETWEEN ENG AND EFP
• They are very similar, but EFP (electronic field production) is carefully planned out much like a studio production, while in ENG (electronic news gathering) there is no time for preproduction because you are recording unplanned events like breaking news or developing news.
• ENG mostly covers breaking news whereas EFP covers documentary, sports, parades.
• Contrasted with the production values of EFP, in electronic journalism or ENG the emphasis is on quickness and agility in acquisition and rapidity in editing, with final transmission to the audience as the goal. The two terms are often seen paired as EFP-ENG and vice versa.
ELEMENTS OF TV NEWS BULLETIN:
TV EDITING
INTRODUCTION: CONCEPT OF TV NEWS EDITING
● The concept of editing in television news differs from that of editing news for print publications.
● In commercial television and film production, the "shooting ratio," or amount of film or tape shot divided by the length of the finished program, averages about 10:1. In other words, for every minute of edited program you see, ten minutes or more of tape was recorded by the camera crew.
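The shooting ratio is simple arithmetic; a quick sketch (the function name and values below are illustrative):

```python
def raw_footage_minutes(program_minutes, shooting_ratio=10):
    """At a 10:1 shooting ratio, each edited minute needs ~10 recorded minutes."""
    return program_minutes * shooting_ratio

# A 3-minute news capsule therefore implies about 30 minutes of raw tape:
assert raw_footage_minutes(3) == 30
```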
TRANSFORMATION OF TV EDITING
Development of editing process for television broadcasting can be divided into three phases:
● Physical cutting of film:
In the initial phase, there was no concept of editing. It was the cameraman who stopped cranking (rolling) the magazine (film roll) when the shot was over or the shot changed and started cranking again for the next shot. Each shot was planned in such a way that the entire act had to be performed in a single go without any cuts.
● Electro-magnetic video tape editing:
During the 1950s, magnetic videotape recording machines were invented, and the editing process became much easier. Ampex Corporation manufactured the VR-1000, a black-and-white videotape recorder (VTR). It used the prevailing linear method of editing.
● Digital Non-Linear Editing:
Non-linear editing system was introduced in the form of CMX 600 in 1971. It was specified as a RAVE or Random-Access Video Editor by CMX. It had two black and white monitors, one for edited video and the second one for preview. In the preview monitor the editor could view the original footage and select the editing points.
Gradually the technology developed and video editing became easier. Today, there are many such apps available, with the help of which people can do basic video editing on their smartphones.
VIDEO EDITING IN TV
● The ultimate aim of any editing process is to create a final product i.e., film or any video programme. We can divide the editing process into two stages: rough edit and final edit.
● Rough edit is just putting relevant videos or visuals with the audio track. These are fine-tuned later on by trimming/expanding as per the requirement of the story.
● Adding the suitable VFX and transitions to any film/programme makes the entire production crisp, meaningful and attractive.
TWO TYPES OF EDITING: LINEAR AND NON-LINEAR
1) Linear editing:
● Linear video editing is a mechanical process that works in linear steps, one cut at a time (or a series of programmed cuts), from beginning to end. It uses camcorders, VCRs, edit controllers, and mixers to perform the edit functions.
2) Non-Linear editing:
● In digital video editing, this method allows you to access any frame in a digital video clip regardless of the sequence in the clip.
● This method is similar in concept to the cut-and-paste technique used in film editing from the beginning and allows you to include fades, transitions, and other effects.
● In non-linear editing, the original source files are not lost or modified during audio and video editing.
● Loss of quality is also avoided due to not having to repeatedly re-encode the data when different effects are applied. This is the reason why the demand for non-linear video editors is high in the film and video industry.
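The non-destructive idea can be sketched in a few lines: an edit is just a list of references into unmodified source files (the class and file names below are illustrative, not any real editor's project format):

```python
from dataclasses import dataclass

@dataclass
class Clip:
    source: str   # path to the original, never-modified media file
    in_s: float   # in-point, seconds into the source
    out_s: float  # out-point, seconds into the source

# The "timeline" is only metadata; reordering or trimming clips edits this
# list, not the media, so repeated edits cause no generation loss.
timeline = [
    Clip("interview.mp4", 12.0, 18.5),    # sound bite
    Clip("broll_street.mp4", 3.0, 7.0),   # cutaway
    Clip("interview.mp4", 40.0, 44.0),    # same source reused, out of order
]

duration = sum(c.out_s - c.in_s for c in timeline)  # 14.5 seconds
```

This is also why any frame can be accessed regardless of its order in the clip: each entry simply points at a timecode in the source.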
SOFTWARES USED IN NONLINEAR EDITING
● Few examples of both types of NLE software (proprietary and open-source) are as follows-
Proprietary NLE Software
● Adobe® Premiere Pro®
● Sony® Vegas Pro 17®
● Apple® Final Cut Pro X®
● Pinnacle Systems® Pinnacle Studio 23 Ultimate®
● Lightworks®
Open-source and Free NLE Software
● Openshot Video Editor®
● Blender®
● HitFilm Express®
Different editing software may have similar or slightly different function tools, transitions/effects, corrections/adjustments and outputting controls, but the techniques of editing remain almost the same.
VARIOUS TYPES OF TECHNIQUES
1. Continuity Editing Approach:
● Continuity editing approach supports the logical, smooth and seamless progression of the story. According to this approach, viewers should not feel any jerk or distraction while watching a film or any television programme.
● As an editor we have two major challenges. First, to tell the whole story within the time limit of the film or programme, and second, to present the story in such a way that viewers should find it seamless without any visual or psychological jerks or distractions.
2. Parallel Editing
● This is sometimes also termed as cross-cutting. It is an editing technique which is used to cut between shots of two or more actions or events. It suggests that these actions are going on at different places, generally at the same time.
● Cross-cutting includes more than one shot which are used simultaneously to link the story together. It can be used to show the two sides of a story.
3. Montage Editing
● In this editing technique, a series of different types of short shots are put together in a sequence, generally to compress time and space. Montage editing is almost the opposite of continuity editing.
● It may also be used to give the viewer a sense of an overall summary of the story with different shots. Montage can be used to create a dynamic rhythm in a scene.
4. Video Transitions
● We use video transitions to connect two shots one after the other. They provide the viewers with a seamless and smooth visual experience that is the base of continuity editing approach.
● Cut is the most basic but most powerful and maximum used transition. Fade, dissolve, wipe, slide, page peel, iris, etc. are other popular transitions. Transitions are used in audio also. You should always use video transitions with a purpose.
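A cut simply swaps one shot for the next between frames, while a dissolve blends them over time. A minimal sketch of a dissolve as linear interpolation, modelling frames as lists of pixel brightness values (purely illustrative):

```python
def dissolve(frame_a, frame_b, t):
    """Blend two equal-sized frames: t=0 shows all of A, t=1 all of B."""
    return [[(1 - t) * a + t * b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

black = [[0, 0], [0, 0]]
white = [[255, 255], [255, 255]]
# Halfway through the dissolve every pixel is mid-grey:
assert dissolve(black, white, 0.5) == [[127.5, 127.5], [127.5, 127.5]]
```

Stepping `t` from 0 to 1 over, say, 25 frames would produce a one-second dissolve at 25 fps; a fade is the same operation with a solid black frame as one of the inputs.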