Content analysis or textual analysis is a methodology in the social sciences for studying the content of communication. Earl Babbie defines it as "the study of recorded human communications, such as books, websites, paintings and laws". Content analysis is also considered a scholarly methodology in the humanities, by which texts are studied as to authorship, authenticity, or meaning; this latter field includes philology, hermeneutics, and semiotics. Harold Lasswell formulated the core questions of content analysis: "Who says what, to whom, why, to what extent and with what effect?" Ole Holsti offers a broad definition of content analysis as "any technique for making inferences by objectively and systematically identifying specified characteristics of messages", while Kimberly Neuendorf provides a six-part definition: "Content analysis is a summarising, quantitative analysis of messages that relies on the scientific method and is not limited as to the types of variables that may be measured or the context in which the messages are created or presented."

Description
In 1931, Alfred R. Lindesmith developed a methodology to refute existing hypotheses, which became known as a content analysis technique. It gained popularity in the 1960s, when Glaser referred to it as "The Constant Comparative Method of Qualitative Analysis"; Glaser and Strauss later adapted it to formulate "Grounded Theory". Content analysis enables the researcher to include large amounts of textual information and systematically identify its properties, such as the frequencies of the most-used keywords, by locating the more important structures of the communication content. Such amounts of textual information must be categorised to provide a meaningful reading of the content under scrutiny. For example, David Robertson created a coding frame for a comparison of modes of party competition between British and American parties. This approach was developed further in 1979 by the Manifesto Research Group, which aimed at a comparative content-analytic approach to the policy positions of political parties. Since the 1980s, content analysis has become an increasingly important tool in measuring the success of public relations programmes and in assessing media profiles; in these circumstances, content analysis is an element of media evaluation or media analysis, and data from the content analysis are usually combined with media data. It has also been used by futurists to identify trends: in 1982, John Naisbitt published his popular Megatrends, based on content analysis of the US media. The creation of coding frames is intrinsically related to a creative approach to the variables that exert influence over textual content. In political analysis, these variables could include political scandals, the impact of public opinion polls, sudden events in external politics, inflation, etc.
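The mechanical core of the approach described above, counting keyword frequencies and grouping them under the categories of a coding frame, can be sketched in a few lines of Python. This is a minimal illustration, not any researcher's actual instrument: the `CODING_FRAME` categories and keywords here are invented for the example, and a real frame would be derived from the research question.

```python
from collections import Counter
import re

# A hypothetical coding frame mapping categories to keywords.
# These categories and keywords are illustrative only.
CODING_FRAME = {
    "economy": {"inflation", "tax", "jobs"},
    "foreign_policy": {"war", "treaty", "alliance"},
}

def keyword_frequencies(text):
    """Count how often each word occurs in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

def categorise(text):
    """Tally keyword hits per coding-frame category."""
    freqs = keyword_frequencies(text)
    return {cat: sum(freqs[w] for w in kws)
            for cat, kws in CODING_FRAME.items()}

sample = ("Inflation is rising; jobs and tax policy dominate, "
          "while the treaty debate fades.")
print(categorise(sample))  # → {'economy': 3, 'foreign_policy': 1}
```

Real content-analytic work adds steps this sketch omits, such as stemming, handling multi-word phrases, and checking inter-coder reliability, but the category-tally structure is the same.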
Mimetic Convergence, created by Fátima Carvalho for the comparative analysis of electoral proclamations on free-to-air television, is an example of creative articulation of variables in content analysis. The methodology describes the construction of party identities during long-term party competition on TV from a dynamic perspective governed by the logic of the contingent. It aims to capture this contingent logic, as observed in electoral campaigns, by focusing on the repetition and innovation of themes sustained in party broadcasts. According to the post-structuralist perspective from which electoral competition is analysed, party identities ('the real') cannot speak without mediation, because there is no natural centre fixing the meaning of a party structure; meaning depends instead on ad-hoc articulations. There is no empirical reality outside articulations of meaning. Reality is an outcome of power struggles that unify ideas of social structure through contingent interventions. In Brazil, these contingent interventions have proven to be mimetic and convergent rather...