Research Summary
We demonstrate how a novel synthesis of three methods—(a) unsupervised topic modeling of text data to generate new measures of textual variance, (b) sentiment analysis of text data, and (c) supervised ML coding of facial images with a cutting‐edge convolutional neural network algorithm—can shed light on questions related to CEO oral communication. With videos and corresponding transcripts of interviews with emerging market CEOs, we use this synthesis of methods to discover five distinct communication styles that incorporate both verbal and nonverbal aspects of communication. Our data comprise interviews that capture unedited expression and content, making them especially suitable sources for measuring an individual's communication style. We then perform a proof‐of‐concept analysis, correlating CEO communication styles with M&A outcomes, highlighting the value of combining text and videographic data to define styles. We also discuss the benefits of our methods relative to current research methods.
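The three-method synthesis described above can be illustrated schematically. The sketch below is not the authors' code: it uses a toy corpus, a stand-in lexicon for sentiment, and random placeholders where a pretrained CNN would supply facial-expression scores, and it clusters the concatenated verbal and nonverbal features into styles with k-means rather than the paper's actual procedure.

```python
# Illustrative sketch only: combining (a) topic proportions, (b) sentiment,
# and (c) facial-expression scores into communication-style clusters.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.cluster import KMeans

# Toy stand-ins for interview transcripts (real data: 61 CEO interviews).
transcripts = [
    "growth markets expansion acquisition strategy",
    "team culture people values leadership",
    "risk uncertainty markets volatility caution",
    "innovation product technology growth vision",
    "people team trust communication values",
    "acquisition deal integration strategy markets",
]

# (a) Unsupervised topic model: per-transcript topic proportions.
counts = CountVectorizer().fit_transform(transcripts)
topic_props = LatentDirichletAllocation(
    n_components=3, random_state=0
).fit_transform(counts)  # each row sums to ~1

# (b) Sentiment: a toy lexicon score per transcript (hypothetical word lists).
positive = {"growth", "trust", "vision", "values", "innovation"}
negative = {"risk", "uncertainty", "volatility", "caution"}

def sentiment(text: str) -> float:
    words = text.split()
    return (sum(w in positive for w in words)
            - sum(w in negative for w in words)) / len(words)

sent = np.array([[sentiment(t)] for t in transcripts])

# (c) Facial-expression scores would come from a pretrained CNN applied to
# video frames; random placeholders here (e.g., joy / anger / sadness).
face = np.random.default_rng(0).random((len(transcripts), 3))

# Concatenate verbal + nonverbal features, then cluster into styles.
features = np.hstack([topic_props, sent, face])
styles = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
```

In the paper the clustering step yields five interpretable styles; the number of topics, the emotion categories, and the number of clusters here are arbitrary choices for the sketch.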
Managerial Summary
CEOs spend most of their time communicating with investors, customers, and partners with the aim of influencing these various stakeholders. To what extent, though, does their effectiveness as leaders depend on a mixture of what they say and how they say it? We use cutting‐edge machine learning approaches to measure a CEO's communication style, which can give clues about the major strategic decisions a CEO's firm must make. With a collection of video interviews with 61 organizational leaders from emerging markets, we use textual analysis and facial expression recognition to code the extent to which CEOs are “excitable,” “stern,” “dramatic,” “rambling,” and “melancholy” in their communication styles. As a proof‐of‐concept, we also show that CEOs who were more dramatic in expressing themselves were less likely to oversee major acquisitions. Therefore, not only can CEO communication styles help predict a firm's ability to grow, adapt to change, and reallocate existing assets, but styles can also be coded more intuitively using our new method, representing a vast improvement over previous methods in both accessibility and interpretability.