Abstract
Billions of users interact with social platforms daily, but what each user sees, what content is displayed to them, and who is seen is not the product of free choice; it is the result of algorithmic mechanisms that distribute power in these spaces, a development that has made the algorithm a central element in organizing the flow of information and the distribution of symbolic capital. The existing literature has mainly examined this phenomenon from two perspectives: the tool-oriented approach, which treats the algorithm as a neutral technical mechanism, and the autonomous actor approach, which, inspired by Latour's actor-network theory, treats the algorithm as an agent on a par with the human actor. This article argues that both approaches suffer from serious theoretical blind spots: the former ignores power structures, and the latter risks anthropomorphizing technology by equating human and non-human agency. Drawing on Bourdieu's theory of the field, this research analyzes social platforms as digital fields in which actors compete for access to symbolic capital, namely visibility, attention, and prestige. The analysis shows that algorithms regulate the distribution of this capital through three mechanisms: ranking, recommendation, and visibility management. Accordingly, the article proposes the concept of the "structural actor," a mechanism that is neither autonomous nor neutral but operates within the logic of the field and participates in the reproduction of power relations. This concept makes a theoretical contribution to the literature of platform studies, shifting the central question from "What did the designers intend?" to "What logic does the digital field reproduce through algorithms?"
Published in: Humanities and Social Sciences (Volume 14, Issue 2)
DOI: 10.11648/j.hss.20261402.22
Page(s): 161-170
Creative Commons: This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright: Copyright © The Author(s), 2026. Published by Science Publishing Group
Keywords
Social Platforms, Algorithms, Digital Field, Symbolic Capital, Structural Actor, Bourdieu’s Theory of the Field
1. Problem Statement
1.1. The Spread of Algorithmic Platforms in Digital Society
Over the past decade, social platforms have undergone a transformation that fundamentally differentiates their nature from what they were originally designed to be. What began primarily as a space for interpersonal communication and information exchange has gradually evolved into a complex infrastructure for organizing the flow of information, directing users’ attention, and distributing power in digital environments
[15, 28].
What distinguishes this transformation from mere quantitative growth is not an increase in the number of users or the expansion of Internet access, but a fundamental change in the internal logic of platforms: the main question is no longer who uses these platforms, but who is seen, what content is shown to whom, and how power is distributed in this space. Behind these processes lies a hidden force, the algorithm: the invisible hand that, by suggesting new content to users, guides them in directions it has determined, while many users remain unaware that such a system exists
[7].
Platform studies research suggests that these algorithmic mechanisms play an active role in organizing the flow of information. Bucher describes this situation with the concept of “visibility politics,” showing that algorithms implicitly determine which content or actors are most visible. Gillespie likewise argues that through these same mechanisms, platforms have transformed from purely technical intermediaries into institutions that actively participate in shaping the circulation of content and social interactions.
1.2. Theoretical Approaches to Algorithms in Media Studies
In the past two decades, as the role of algorithms in social life has become more prominent, researchers in the fields of media studies and social sciences have also presented a wide range of theoretical approaches to this topic. These approaches can be categorized into six main discursive clusters: theories of power, governance, and transparency that see algorithms as institutions that manage social order
; theories of political economy and infrastructure that analyze algorithms within the logic of platform capitalism
; theories of culture, taste, and capital that see algorithms as cultural products
; theories of justice, inequality, and bias that focus on the reproduction of structural discrimination
; theories of agency, play, and resistance that emphasize user action against algorithms
[10, 11]
; and theories of systemic logic and media effects that analyze algorithmic feedback loops
.
Despite the richness of this literature, each of these clusters captures only part of the algorithm phenomenon, and none alone can explain the relationship between algorithmic structure, actor competition, and power distribution within a coherent framework. Among these approaches, two perspectives dominate the literature and have had the greatest impact on how algorithms are understood and on policy-making about them: the tool-oriented view, which sees the algorithm as a neutral technical tool, and the autonomous actor view, which sees it as a counterpart to the human actor. A careful examination of each perspective and its associated research gaps allows us to pose the main question of this research.
1.2.1. Instrumental View of Algorithms
In a significant part of the technology literature and the official discourse of platform companies, algorithms are still described as purely technical mechanisms. In this framework, the algorithm is reduced to a set of computational instructions whose task, in addition to processing data, is to manage content and optimize the user experience. Such a view turns the algorithm into a technical tool that serves human-designed purposes without having an independent social role. This view is known in the science and technology studies literature as the “instrumental view,” and its fundamental assumption is that technology is neutral to social relations
.
The roots of this approach lie in the positivist tradition of engineering and computer science, where technology has always been treated as a tool rather than a social phenomenon. In this tradition, technologies are seen largely as neutral instruments for solving technical problems, and their social value is a function of how humans use them, not of the nature of the technology itself. It is this perspective that allows large companies to attribute any negative social consequence of algorithms to human error or misuse of the tool, rather than to flaws in the system itself. In other words, this interpretation allows technology companies to disclaim responsibility for the social consequences of algorithms and to present them simply as neutral tools.
This framework of thought has not remained confined to academic circles but is also reproduced in the official narratives of technology companies; when Meta or Google introduce their algorithmic systems as "recommendation systems" or "optimization engines" whose main goal is to improve the user experience, they translate exactly this assumption of neutrality into business language. In such a narrative, algorithms are seen as mechanisms that merely increase the efficiency of the system and play no role in shaping power relations or social inequalities
. Caplan and Boyd show that these official narratives are part of the institutional strategy of platforms, through which they try to minimize accountability for the social consequences of algorithms
.
However, critical research in recent years has shown that this picture of algorithms is insufficient to explain the complexity of their role in digital environments. One important term in algorithm studies is the “black box,” which Pasquale used because many algorithmic mechanisms are kept hidden from public view: users do not know how the algorithm makes decisions and therefore cannot hold anyone accountable or demand transparency
. Noble argues in her research that algorithms that appear neutral can in fact reproduce and exacerbate racial and gender inequalities. Similarly, another author, in a study of algorithmic systems in public services, argues that these systems systematically discriminate against vulnerable groups
.
As this research shows, a purely instrumental perspective cannot explain the structural role of algorithms in shaping power relations and the distribution of opportunities in digital environments. From this body of work we can conclude that algorithmic technology is not neutral and that the hypothesis of technological neutrality is readily challenged.
1.2.2. The Autonomous Actor’s View of Algorithms
In contrast to the tool-oriented view, other researchers in critical technology studies and media studies take a fundamentally different approach to algorithms. From their perspective, the algorithmic system is not a passive tool but an independent actor with a certain degree of freedom.
The theoretical origin of this group stems mainly from Latour's Actor-Network Theory (ANT). Latour argues that algorithms, like human actors, are embedded in networks of socio-technical relations and influence the course of social action. What distinguishes ANT from previous approaches is that it treats the traditional distinction between humans and nonhumans, and specifically between society and technology, not as a fixed truth but as a questionable assumption. Latour argues that anything, whether human or algorithm, that can influence a relationship and make a difference in the network of relations has some kind of agency.
Accordingly, algorithms are also considered as “mediators” that actively participate in the production and reproduction of social reality, rather than simply neutral transmitters of information
. This framework allows researchers to examine the active role of algorithms in shaping user experiences, distributing information, and organizing social interactions.
In the platform studies literature, this approach has been expanded upon in the form of concepts such as “algorithmic agency” and “infrastructural agency.” Using this framework, Bucher shows that social platform algorithms not only distribute content, but also actively determine who and what content has a greater presence in the digital space through the “politics of visibility”
[7, 15]
. Gillespie likewise argues that platforms play an active role in mediating information through their algorithms, transforming them from purely technical intermediaries into institutions with the power to shape public discourse. Seaver, studying music recommendation systems, shows that algorithms operate through “distributed agency” in a network of relationships between data, users, and technical infrastructure
.
However, by equating human and non-human agency, this approach ignores the unequal power structures within which algorithms operate and carries the risk of anthropomorphizing technology, a critique that is explored in detail in Section 4.
1.3. Research Problem and Questions
The theoretical gap between these two approaches, shown comparatively in Table 1, forms the fundamental question of this article. The instrumental approach does not see the structure, and the autonomous actor approach ignores power; neither is able to explain the relationship between algorithm, field structure, and power distribution in a coherent framework.
Table 1. Comparison of two different approaches to algorithms in social platforms.
Dimension | Instrumental View | Autonomous Actor View |
Definition of algorithm | A set of computational instructions for processing data | A non-human actor with independent agency within a network of relations |
Source of effect | The goals and decisions of designers and platform companies | The web of relations between humans and non-humans |
Relation to power | Neutral — no independent role in distributing power | An active force, but with no real account of power structures |
Social effect | An unintended consequence or a fixable technical error | The outcome of translation and transformation across a network of actors |
Blind spot | Power structures and inequality | Anthropomorphizing technology and overlooking institutional relations |
Building on Bourdieu's field theory, this article aims to move beyond these two theoretical poles and presents a framework in which the power structure plays a central role while the anthropomorphism of technology is avoided. The main research question is:
If social platform algorithms cannot be reduced to either a technical tool or an independent actor, what place do these algorithms have in the structure of the digital field and through what mechanisms do they play a role in the distribution of symbolic capital?
To answer this question, three sub-questions are raised:
What structural features do social platforms have that allow them to be analyzed as a digital field in the Bourdieuian sense, and what form does symbolic capital take in this field?
Through what mechanisms do social platform algorithms regulate the distribution of symbolic capital in the digital field?
How does the concept of "structural actor" differ from existing theoretical approaches and what contribution does it add to the literature of platform studies and algorithm studies?
2. Digital Field and the Position of Algorithm
2.1. The Concept of Field and Symbolic Capital in Bourdieu's Theory
In the theoretical tradition of sociology, the concept of field in Bourdieu's thought is one of the important analytical frameworks for understanding how power relations, competition, and resource distribution are formed in social spaces. Bourdieu defines the field as a social space in which actors with different positions compete with each other within a set of relatively stable rules for access to different types of capital
. This competition is not based on an explicit and predetermined agreement, but on a set of implicit and often unwritten rules that have been established over time and that guide the behavior of actors. Each field has its own internal logic; that is, the criteria for valuation, the types of capital that are valid, and the methods of acquiring a position in each field may differ from those in other fields. Despite these differences, the general structure of competition, positioning, and distribution of capital is observable in all fields.
In Bourdieu’s theoretical framework, the position of actors in each field is determined by four types of capital: economic capital, cultural capital, social capital, and symbolic capital, each of which can be transformed into the others under appropriate circumstances. Bourdieu sees symbolic capital as a form of social prestige, recognition, and legitimacy that often results from the transformation of other types of capital—such as economic, cultural, or social capital—into symbolic form
[4]
. In many traditional social fields, this capital was reproduced through formal institutions, educational qualifications, professional networks, or social rituals.
However, in digital environments, part of the mechanism of production and distribution of this type of capital has been transformed and tied to the technical infrastructure of the platforms. In the digital field, the first three forms of capital serve as inputs: high-quality content as cultural capital, the network of followers as social capital, and financial power as economic capital. The algorithm operates precisely at the point where these capitals are converted into symbolic capital, that is, visibility, attention, and credibility. Hence, the analysis of this article remains focused on symbolic capital as the main output of the digital field.
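The conversion point described above can be illustrated with a deliberately simplified sketch. All field names, weights, and the functional form below are hypothetical and invented for illustration; the structural point is only that the algorithm sits at the junction where cultural, social, and economic inputs become a symbolic output (visibility).

```python
from dataclasses import dataclass

@dataclass
class Actor:
    """Input capitals an actor brings to the digital field (illustrative only)."""
    content_quality: float   # cultural capital, scaled 0..1
    follower_count: int      # social capital
    ad_spend: float          # economic capital, arbitrary currency units

def symbolic_capital(actor: Actor) -> float:
    """Toy conversion of input capitals into a single visibility score.

    The weights are hypothetical; the point is structural: the algorithm
    operates where other capitals are converted into symbolic capital.
    """
    social = actor.follower_count ** 0.5   # diminishing returns on audience size
    economic = actor.ad_spend * 0.01       # paid reach as a direct additive input
    return actor.content_quality * (1 + social) + economic

creator = Actor(content_quality=0.8, follower_count=10_000, ad_spend=500.0)
print(round(symbolic_capital(creator), 1))  # → 85.8
```

The square-root term encodes one possible assumption (audience size helps, but with diminishing returns); any real platform's conversion function is proprietary and far more complex.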
On social platforms, indicators such as the number of followers, the level of user engagement with content, and the level of visibility in the information flow have gradually become visible forms of symbolic capital. Empirical research in the field of platform studies shows that having such indicators can lead to broader economic, cultural, and professional opportunities; for example, influencers who achieve high levels of visibility and engagement on platforms often achieve commercial partnerships, advertising contracts, and greater access to economic resources
. Therefore, understanding how symbolic capital is distributed in platform environments requires paying attention to the mechanisms that organize the flow of visibility and user attention.
2.2. Social Platforms as Digital Fields
In recent years, the application of the concept of field to the analysis of social platforms has received increasing attention in the media and platform studies literature. Many scholars argue that platforms such as Instagram, YouTube, or TikTok possess many of the structural features of a social field. These environments are populated by a set of actors with different positions; there is competition for scarce resources, especially users’ attention and visibility; and the behavior of actors is shaped by a set of implicit rules that have gradually become entrenched in the logic of the platform
.
In the digital field, symbolic capital takes on a new form that is superficially different from what Bourdieu described in traditional fields but follows the same logic: visibility, interaction, and credibility have replaced academic credentials and professional networks. Content visibility, user engagement, and engagement indicators such as likes, comments, and reposts become criteria that determine the position of actors in the digital field. Content creators, influencers, brands, and even ordinary users compete for these resources and gradually develop strategies to increase their visibility. Some research suggests that over time, actors on platforms internalize a kind of digital “game sense,” meaning they learn how to align their behavior with the logic of the platforms and algorithmic expectations
.
This competition, however, does not take place in a completely open and equal space, as the flow of information on social platforms is largely regulated by algorithmic mechanisms that structure the paths to visibility. Cotter’s empirical study of Instagram influencers shows that content creators constantly try to adapt their behavior to algorithmic changes to increase the likelihood of their content being seen
[11]
. These findings suggest that algorithms not only play a role in the distribution of attention, but also indirectly shape the strategies and practices of actors in the digital arena.
2.3. The Place of the Algorithm in the Structure of the Field
In this framework, an important question arises about the mechanism regulating the rules of competition and the distribution of positions in the digital field, the answer to which lies in the role of algorithms. Algorithms in these environments are not simply technical tools for processing data, but are also infrastructures that organize the flow of information, the order of content display, and the paths of users' attention
.
In practice, algorithms operate not through a single mechanism but through three interrelated and mutually reinforcing processes that collectively determine the distribution of attention and symbolic capital in the digital field. Ranking algorithms determine which content is placed higher in the information flow. Recommendation systems suggest certain posts or accounts to users, creating new paths of content discovery. In addition, filtering or limiting mechanisms can reduce the visibility of certain content.
Recent research suggests that these algorithmic mechanisms play an important role in shaping the distribution of attention on platforms and, consequently, in influencing how actors access symbolic capital
. Algorithms, in this framework, are part of the structure of the digital field — they are the mechanism that regulates access to symbolic capital, a role that Section 4 examines in detail.
Figure 1. The structure of the digital field and the place of the algorithm in it.
As Figure 1 shows, the digital field is composed of four main elements. Actors interact with the algorithm through competition for visibility; the algorithm both enforces the rules of the field and regulates the distribution of visibility; and this distribution of visibility reproduces the actors’ positions through the accumulation of symbolic capital. The feedback loop at the bottom of the diagram represents the continuous reproduction of the hierarchical structure of the field
[5, 7, 11].
3. Algorithms and the Distribution of Symbolic Capital
3.1. Algorithms and Content Ranking
In many contemporary social platforms, the order in which content appears in users’ information streams is organized largely by ranking algorithms. At first glance, this mechanism may be viewed as merely a technical solution for organizing a vast amount of data; however, how content is ranked has important implications for the distribution of users’ attention and can thus influence the distribution of symbolic capital in the digital field. Content ranking in this context means that the algorithm determines which posts are placed higher in users’ feeds, which videos appear in suggested sections, and what types of content are more likely to be seen.
Studies in the field of algorithmic studies show that ranking systems typically rely on a set of behavioral signals. Indicators such as user engagement rate, content viewing duration, and past user behavior patterns are used as metrics to predict the likelihood of engagement with content on many platforms
[7]
. In such situations, content that already has a higher level of symbolic capital—for example, a higher number of followers or a higher level of engagement—often has a better chance of being placed higher in the information flow. This superior position can lead to greater visibility and engagement, and thus generate more symbolic capital.
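The signal-based ranking the literature describes can be sketched in a few lines. The signal names, weights, and linear form below are hypothetical stand-ins, not any platform's actual model; the follower term is included to show how accumulated symbolic capital re-enters the ranking as an input.

```python
# Illustrative feed ranking: score posts by predicted engagement signals.
# Signal names, weights, and data are invented for illustration only.
posts = [
    {"id": "a", "engagement_rate": 0.08, "avg_watch_seconds": 12.0, "author_followers": 500},
    {"id": "b", "engagement_rate": 0.03, "avg_watch_seconds": 45.0, "author_followers": 200_000},
    {"id": "c", "engagement_rate": 0.12, "avg_watch_seconds": 30.0, "author_followers": 15_000},
]

def predicted_engagement(post):
    """Linear stand-in for the engagement-prediction models the literature describes."""
    return (
        5.0 * post["engagement_rate"]
        + 0.02 * post["avg_watch_seconds"]
        + 0.1 * (post["author_followers"] ** 0.5)  # prior symbolic capital feeds back in
    )

feed = sorted(posts, key=predicted_engagement, reverse=True)
print([p["id"] for p in feed])  # → ['b', 'c', 'a']
```

Note the outcome: post "b" has the weakest engagement rate but the largest existing audience, and it still tops the feed, which is exactly the capital-reproduction dynamic the text describes.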
Some scholars have described this process as a kind of “logic of reinforcement”—a cycle in which greater visibility can lead to greater engagement, and greater engagement in turn increases the likelihood of content being seen
[10]
. From the perspective of Bourdieu’s field theory, such a mechanism can be compared to processes of capital reproduction in social fields, in the sense that actors who already have a higher level of capital may acquire more entrenched positions in the field structure.
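The "logic of reinforcement" can be made concrete with a toy simulation. All parameters are invented: two accounts start with unequal engagement, each round's impressions are allocated superlinearly in accumulated engagement (exponent above 1 is an assumed modeling choice standing in for compounding visibility), and the initial advantage grows.

```python
# Toy "logic of reinforcement": visibility allocated superlinearly in
# accumulated engagement, so an initial advantage compounds.
# All numbers and the exponent are invented for illustration.
engagement = {"established": 100.0, "newcomer": 10.0}
IMPRESSIONS_PER_ROUND = 1000
CONVERSION = 0.05   # fraction of impressions converted into new engagement
EXPONENT = 1.5      # >1: visible accounts get a more-than-proportional share

for _ in range(10):
    weights = {actor: e ** EXPONENT for actor, e in engagement.items()}
    total = sum(weights.values())
    for actor in engagement:
        share = weights[actor] / total   # visibility tracks prior capital
        engagement[actor] += share * IMPRESSIONS_PER_ROUND * CONVERSION

ratio = engagement["established"] / engagement["newcomer"]
print(f"advantage ratio after 10 rounds: {ratio:.1f} (started at 10.0)")
```

With a purely proportional allocation (exponent of 1) the ratio would stay constant; the superlinear share is what turns a head start into a widening gap, mirroring the cycle of capital reproduction described above.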
3.2. Algorithms and Content Recommendation
While content ranking is primarily concerned with organizing the display order of content that users have previously accessed, recommender systems play a more active role in shaping users’ information experience. They suggest content that users may not necessarily have sought out, thereby shaping potential paths to content discovery. In other words, the algorithm in recommender systems is not only an intermediary between the user and the available content, but also plays a role in determining what type of content users encounter.
Studies of recommender systems show that these systems typically use a combination of user behavioral data, content similarities, and statistical prediction models to estimate the likelihood of users engaging with different types of content
. In such a framework, algorithms may give certain content or content creators more exposure. For example, Huszár et al.'s empirical research on Twitter's recommendation algorithm shows that these systems in some cases tend to amplify certain types of content or certain political leanings, a finding that suggests that algorithmic recommendation can play an important role in shaping patterns of visibility on platforms
[18]
.
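A minimal user-similarity recommender in the spirit of the systems described above can be sketched as follows. The interaction data, the cosine measure, and the scoring rule are illustrative assumptions; production systems combine many more behavioral and content signals.

```python
import math

# Toy user-item interactions: 1.0 = user engaged with that topic. Invented data.
interactions = {
    "u1": {"cooking": 1.0, "travel": 1.0},
    "u2": {"cooking": 1.0, "fitness": 1.0},
    "u3": {"travel": 1.0, "fitness": 1.0},
}

def cosine(u, v):
    """Cosine similarity between two sparse interaction vectors."""
    dot = sum(u[k] * v[k] for k in u if k in v)
    norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

def recommend(target):
    """Suggest the unseen item favored by the users most similar to the target."""
    seen = interactions[target]
    scores = {}
    for other, items in interactions.items():
        if other == target:
            continue
        sim = cosine(seen, items)
        for item, rating in items.items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return max(scores, key=scores.get)

print(recommend("u1"))  # → fitness
```

The structural point matches the text: "u1" never sought out fitness content, yet the system routes it to them because similar users engaged with it, which is how recommendation shapes discovery paths rather than merely transmitting requests.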
From the perspective of field theory, this process can be seen as a mechanism that regulates the paths of access to symbolic capital in the digital field. When algorithms recommend certain content to a wide range of users, this content has a greater opportunity to be seen and interacted with. Conversely, some content may receive less exposure, even without users directly evaluating it. As a result, recommender systems can indirectly contribute to shaping the distribution of attention and credibility in the digital field.
3.3. Algorithms and Visibility Management
Figure 2. Three mechanisms of algorithmic governance of visibility in the digital field and their implications for the distribution of symbolic capital.
In addition to content ranking and recommendation, many social platforms use other algorithmic mechanisms that directly or indirectly affect the visibility of content. These mechanisms can include boosting the publication of some content, limiting access to others, or gradually reducing its appearance in users' news feeds, without the content creator necessarily being aware of this process.
In the platform studies literature, these processes are often examined under the concept of “algorithmic management of visibility”
[16, 7]
.
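The boost/limit decisions described above can be sketched as a moderation layer applied after ranking. The policy labels and multipliers are hypothetical; real platforms do not disclose theirs, which is precisely the opacity the shadowbanning literature documents.

```python
# Illustrative visibility-management pass: re-weight ranked scores by
# policy-dependent factors. Policy labels and multipliers are hypothetical.
VISIBILITY_FACTORS = {
    "promoted": 1.5,     # boosted distribution
    "normal": 1.0,
    "borderline": 0.2,   # downranked without notifying the creator ("shadowban")
    "removed": 0.0,      # excluded from the feed entirely
}

def apply_visibility_policy(ranked):
    """Re-weight and re-sort scored posts; drop fully suppressed ones."""
    visible = []
    for post in ranked:
        factor = VISIBILITY_FACTORS[post["policy"]]
        if factor > 0:
            visible.append({**post, "score": post["score"] * factor})
    return sorted(visible, key=lambda p: p["score"], reverse=True)

feed = apply_visibility_policy([
    {"id": "a", "score": 10.0, "policy": "borderline"},
    {"id": "b", "score": 6.0, "policy": "normal"},
    {"id": "c", "score": 5.0, "policy": "promoted"},
    {"id": "d", "score": 9.0, "policy": "removed"},
])
print([p["id"] for p in feed])  # → ['c', 'b', 'a']
```

In the example, the highest-scoring post "a" drops to last place and "d" disappears entirely, while the boosted "c" overtakes both; from the creators' side, nothing signals that a policy layer intervened.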
The empirical evidence in this area has expanded significantly in recent years, and studies from various platforms, from TikTok to Instagram, provide a coherent picture of algorithmic management of visibility. In a qualitative study of TikTok user experiences, Karizat et al. showed that many content creators experience a sudden drop in the visibility of their posts without receiving any formal notification from the platform, a phenomenon they refer to as “shadowbanning”
[19]
.
In this regard, Cotter also shows in her analysis of the behavior of Instagram influencers that active users continuously adjust their strategies based on an implicit understanding of algorithmic restriction patterns
[10]
; a finding that shows that visibility management is not merely a technical process, but a social reality that shapes the behavior of actors.
Furthermore, Haimson et al. documented in a study of transgender user-generated content on social platforms that some marginalized groups systematically face reduced visibility, evidence that reveals the structural dimensions of this process
[17]
. Within Bourdieu’s field theory, these mechanisms are part of the process of regulating actors’ access to symbolic capital. When algorithms amplify some content and restrict others, they rearrange the paths to attention and credibility in the digital field, and this rearrangement, despite its technical appearance, reflects the logics of power embedded in the platform structure. Visibility on social platforms cannot therefore be seen simply as the product of users’ free interaction, but as the result of a structural process in which the algorithm plays a decisive intermediary role.
As the diagram above shows, each mechanism leads to an unequal distribution of visibility, attention, credibility, and position in the digital field through a different operational logic, such as ranking, discovery, or access management. The “amplifies” and “reinforces” arrows between the mechanisms indicate the mutual reinforcement of these processes and the gradual accumulation of symbolic capital by dominant actors [7, 10, 17, 19].
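The cumulative dynamic those arrows describe can be illustrated with a deliberately simplified simulation. The sketch below is purely hypothetical and does not model any real platform's ranking system; it only shows how a feedback loop, in which exposure is allocated in proportion to prior visibility and engagement feeds back into the next ranking score, lets a small initial advantage compound into a growing share of visibility:

```python
# Illustrative sketch only: a toy feedback loop, not any platform's
# actual ranking algorithm. Two hypothetical accounts, "A" and "B",
# produce content of equal quality; "A" starts with a 5% visibility edge.
visibility = {"A": 1.05, "B": 1.00}

for _ in range(30):
    total = sum(visibility.values())
    for account in visibility:
        # Ranking: exposure is allocated in proportion to current visibility.
        exposure = visibility[account] / total
        # Superlinear feedback: engagement rewards the already-visible
        # and feeds back into the next round's ranking score.
        visibility[account] += exposure ** 2

share_a = visibility["A"] / sum(visibility.values())
print(f"A's initial visibility share: {1.05 / 2.05:.3f}")  # 0.512
print(f"A's share after 30 rounds:   {share_a:.3f}")
```

Because the increment is superlinear in exposure, the gap between the two accounts widens each round even though the underlying content never changes, a minimal analogue of the gradual accumulation of symbolic capital described above.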
4. Structural Actors: A Framework Beyond the Two Existing Approaches
4.1. Algorithms as a Structural Operator: Definition and Distinctions
Figure 3. Three theoretical positions on algorithms and their place on the structure-agency spectrum.
An examination of the two approaches, instrumental and autonomous, revealed that each suffers from a theoretical blind spot at a different point: the former fails to see the power structure, and the latter loses it in the anthropomorphization of technology. To overcome this dichotomy, this article proposes the concept of the “structural actor.” The structural actor is a mechanism that is neither a human agent with independent intention and will, nor a completely neutral tool that simply carries out the orders of its designers; rather, it operates from within the structure of the digital field, implementing the rules of symbolic capital distribution while at the same time playing a role in reproducing that same structure.
As Figure 3 shows, the concept of the structural actor occupies a distinct position on the structure-agency spectrum that distinguishes it from both poles: it stands neither on the side of pure structural determinism nor on the side of independent agency, but in the dialectical space between the two. At one end of the spectrum, the tool-oriented approach considers the algorithm a neutral executor of the designers’ instructions and denies it any independent structural role. At the other end, Latour’s actor-network theory (ANT) considers the algorithm a mediator that, like human actors, is located in a network of relationships and can translate and transform what passes through it. The concept of the structural actor is located in the middle of this spectrum, in the dialectic of structure and agency.
Understanding the analytical significance of the structural actor concept requires a careful explanation of how it differs from both existing approaches, as these distinctions produce serious differences not only in terminology but also in analytical and policy implications. By limiting the analysis to the intention and will of the designers, the instrumental approach ignores the structural effects of the algorithm on power relations, thereby diverting attention from the systematic inequalities that arise from the logic of the platform itself. By equating human and nonhuman agency, ANT ignores the unequal power structures within which the algorithm operates.
The concept of “infrastructural agency,” also used in the platform studies literature [27, 28], focuses mainly on the technical layer of the infrastructure and does not sufficiently address the dynamics of competition, the distribution of capital, and the logic of the field. In contrast to all three, the structural actor framework neither equates algorithmic agency with human agency nor confines it to the technical layer; rather, it emphasizes that the algorithm operates within the logic of the digital field and thereby participates in the reproduction of power hierarchies.
This conception of the algorithm is deeply consistent with what Bourdieu describes about the relationship between structure and agency: just as human actors are neither completely subordinate to structure nor completely free from it, algorithms operate in a space between executing orders and shaping social reality. Algorithms are designed within the framework of the platform structure, but in organizing the flow of information and distributing attention, they play a role beyond the mere execution of orders. Seaver describes this feature with the concept of “distributed agency,” an agency that is dispersed across a network of relationships between algorithms, data, users, and platform structures [26].
As a result, the concept of the structural actor makes it possible to move beyond the instrument/actor dichotomy; it sees the algorithm neither as a neutral instrument nor as an intentional agent, but as a mechanism that implements the logic of the field from within and, in doing so, shapes the distribution of symbolic capital: visibility, attention, and social prestige.
4.2. Conclusion, Limitations and Future Research Directions
Beyond its descriptive value, the concept of the structural actor has important implications for the platform studies literature, the most direct of which is to redefine the question we ask when studying algorithms: not “What did the designers intend?” but “What logic does the digital field reproduce through these algorithms?” This shifts attention away from individual technical errors and toward the power structures within which algorithms are designed and deployed.
However, scientific integrity requires that the limitations of this framework be stated as clearly as its claims. The first limitation relates to Bourdieu's framework itself: field theory was originally designed to analyze human actors, and applying it to nonhuman technology requires extensions that Bourdieu himself did not address, extensions that this article suggests are possible and useful but that future research should test more carefully.
The second limitation is that the concept of the structural actor is presented at a theoretical-conceptual level and has not been tested against direct empirical data. The third limitation is that the framework focuses mainly on content-based platforms, and its generalization to other platforms, such as digital marketplaces or job search platforms, requires independent research.
These limitations should be seen not as weaknesses of this proposal but as avenues through which future research can test and extend the structural actor framework in more diverse empirical contexts. Comparative studies across different platforms, combining qualitative and quantitative methods, could illuminate the limits of the concept’s applicability across diverse digital fields. The structural actor is ultimately not a solution to the problem of algorithmic power, but a means to see it more clearly and, perhaps, to ask better questions about it.
Author Contributions
Fatemeh Nouri Dehnavi: Conceptualization, Methodology, Writing – original draft, Writing – review & editing
Conflicts of Interest
The author declares no conflicts of interest.
References
[1] Abidin, C. (2021). Influencer fatigue. Social Media + Society, 7(1). https://doi.org/10.1177/2056305121997567
[2] Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(7), 985–997. https://doi.org/10.1080/1369118X.2016.1216147
[3] Bishop, S. (2020). Algorithmic experts: Selling algorithmic lore on YouTube. Social Media + Society, 6(1). https://doi.org/10.1177/2056305119897323
[4] Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.), Handbook of theory and research for the sociology of education (pp. 241–258). Greenwood Press. https://doi.org/10.4324/9780429494338
[5] Bourdieu, P. (1993). The field of cultural production: Essays on art and literature. Columbia University Press. https://doi.org/10.2307/431688
[6] Brosch, A. (2021). Digital habitus and identity in social media. Journal of Youth Studies, 24(3), 312–327. https://doi.org/10.1080/13676261.2020.1728239
[7] Bucher, T. (2018). If...then: Algorithmic power and politics. Oxford University Press.
[8] Callon, M. (1986). Some elements of a sociology of translation. In J. Law (Ed.), Power, action and belief (pp. 196–233). Routledge. https://doi.org/10.1111/j.1467-954X.1984.tb00113.x
[9] Caplan, R., & boyd, d. (2018). Isomorphism through algorithms: Institutional dependencies in the case of Facebook. Big Data & Society, 5(1). https://doi.org/10.1177/2053951718757253
[10] Cotter, K. (2019). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), 895–913. https://doi.org/10.1177/1461444818815684
[11] Cotter, K. (2021). “Shadowbanning is not a thing”: Black box gaslighting and the power to independently know and credibly critique algorithms. Information, Communication & Society, 26(6), 1226–1243. https://doi.org/10.1080/1369118X.2021.1994624
[12] Couldry, N., & Hepp, A. (2017). The mediated construction of reality. Polity Press. https://doi.org/10.1177/0163443717737614
[13] Diakopoulos, N. (2016). Accountability in algorithmic decision making. Communications of the ACM, 59(2), 56–62. https://doi.org/10.1145/2844110
[14] Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press. https://doi.org/10.1080/10999922.2018.1511671
[15] Gillespie, T. (2014). The relevance of algorithms. In Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009
[16] Gillespie, T. (2019). Custodians of the Internet. Yale University Press. https://doi.org/10.12987/9780300235029
[17] Haimson, O. L., Berner, A., Marwick, A., & Starks, B. (2021). Disproportionate removals and differing content moderation experiences for conservative, transgender, and black social media users. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2). https://doi.org/10.1145/3479610
[18] Huszár, F., Ktena, S. I., O'Brien, C., Belli, L., Schlaikjer, A., & Hardt, M. (2022). Algorithmic amplification of politics on Twitter. PNAS, 119(1), e2025334119. https://doi.org/10.1073/pnas.2025334119
[19] Karizat, N., Delmonaco, D., Eslami, M., & Andalibi, N. (2021). Algorithmic folk theories and identity: How TikTok users co-produce knowledge of identity and engage in meaning-making on the platform. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2). https://doi.org/10.1145/3476046
[20] Latour, B. (2013). Reassembling the social: An introduction to actor-network-theory. Oxford University Press. https://doi.org/10.17323/1726-3247-2013-2-73-87
[21] Mackenzie, A. (2019). Machine Learners: Archaeology of a Data Practice. Cultural Sociology. https://doi.org/10.1177/1749975518808476
[22] MacKenzie, D., & Wajcman, J. (Eds.). (1999). The social shaping of technology (2nd ed.). Open University Press. https://researchonline.lse.ac.uk/id/eprint/28638/1/Introductory%20essay%20%28LSERO%29.pdf
[23] Noble, S. U. (2025). Review of Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Umoja Noble. Education Review. https://doi.org/10.14507/er.v32.4117
[24] Pasquale, F. (2016). The black box society: The secret algorithms that control money and information. Business Ethics Quarterly. https://doi.org/10.1017/beq.2016.50
[25] Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press. https://doi.org/10.1177/1461444819878844
[26] Seaver, N. (2022). Computing taste: Algorithms and the makers of music recommendation. University of Chicago Press. https://press.uchicago.edu/ucp/books/book/chicago/C/bo183892298.html
[27] Star, S. L., & Ruhleder, K. (1996). Steps toward an ecology of infrastructure. Information Systems Research, 7(1), 111–134. https://doi.org/10.1287/isre.7.1.111
[28] van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press. https://doi.org/10.23860/mgdr-2018-03-03-08
[29] Winner, L. (2017). Do artifacts have politics? Computer Ethics. https://doi.org/10.5040/9798216385448.ch-17
[30] Zuboff, S. (2020). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Catalan Journal of Communication & Cultural Studies. https://doi.org/10.1386/cjcs_00020_5
Cite This Article
APA Style
Dehnavi, F. N. (2026). From Instrument to Structure: The Algorithm as a Structural Actor in the Digital Field. Humanities and Social Sciences, 14(2), 161-170. https://doi.org/10.11648/j.hss.20261402.22