Identity, crimes, and law enforcement in the Metaverse

Laws and research have explored measures to mediate these issues. They include restrictions on AI use depending on risks (European Parliament, 2024), compensatory measures, such as compensation funds and insurance schemes (Barbosa, 2024; Casado, 2024; Vellinga, 2024), and innovations in AI design, such as AIs that can explain themselves (Padovan et al., 2023) to diminish ambiguities about culpability or that can learn moral behavior through rewards and punishments (Li et al., 2024; Noothigattu et al., 2019). To solve compensation issues, AIs could be trained to treat financial loss as punishment and financial gain as reward, much like humans. However, defining moral behavior for an AI could come with ambiguities, as seen in science fiction writer Isaac Asimov’s short story “Liar!” (Asimov, 2004), in which a humanoid robot capable of conversation, similar to an AI-powered chatbot avatar in the Metaverse, is programmed so that it must not ‘harm’ humans. When a user asks the robot whether her crush loves her back, the robot gives an affirmative answer, regardless of the truth, because it perceives that any other answer would harm the user emotionally. While this brings the user joy in the short term, she suffers a different emotional harm when she discovers the truth. Disclosing the truth could prevent later disappointment, but it challenges AI capabilities in determining the truth (McIntosh et al., 2024) and has privacy implications (e.g., the AI would need the crush’s personal data). Alternatively, how experts respond in emotionally difficult situations could be studied (e.g., conversations about end-of-life care (Balaban, 2000)).

Two types of crimes could exist in the Metaverse: “cybercrimes”, acts already criminalized by many jurisdictions despite some nuances (Iu and Wong, 2023), and “fantasy crimes”, acts that have generally not been criminalized but whose physical-world counterparts generally are (e.g., virtual sexual assault). The criminalization of fantasy crimes is contested mainly due to a historical focus on proving physically tangible harm. However, current debates suggest that, as we move toward a world that blends the physical with the virtual (the Metaverse), a re-assessment of the costs and benefits of criminalizing fantasy crimes is needed, possibly with more weight on the psychological impact (section “Introduction”).

To inform criminalization, this section provides insights into cybercrimes (section “Cybercrimes”) and fantasy crimes (section “Fantasy crimes”) by covering existing cases, hypothetical scenarios, and different laws across jurisdictions. Existing cases and laws could provide insights into the costs to society of resolving differing views for an international framework. However, they need to be extended with hypothetical scenarios because the society we know now might be vastly different from our envisioned society integrated with the Metaverse. The predicted 2024 user penetration of essential Metaverse technologies (e.g., virtual world and blockchain asset use) is only 14.6% but is expected to reach 39.7% by 2030 (Statista, 2024b). This means that the individuals who make up current statistics (e.g., empirical evidence on the effects of Metaverse-related technologies on behavior) have lived in an environment with little exposure to what would resemble the Metaverse. As a person’s environment can influence their concept of morality (Piaget, 2013; Smetana and Jambon, 2017), the moral reasoning behind current individuals’ behavior might not reflect that of individuals in our envisioned Metaverse world. For this reason, speculation and even divided opinions resulting from research (e.g., (Burkhardt and Lenhard, 2022; Coyne and Stockdale, 2021; Drummond et al., 2021; Ferguson et al., 2020; Malamuth, 2018) on the effects of violent media on violent behavior) could all be relevant to the future and be seen as possible versions of the world. For instance, the less weight placed on violence in the virtual, the more likely a person who has grown up in the Metaverse is to suffer from identity confusion (section “Identity”).

Cybercrimes are generally defined as acts committed through information and communication technologies that violate the law of the physical world (Charlton, 2024; Sukhai, 2004; United Nations Office on Drugs and Crime, 2019). We consider potentially major cybercrimes.

In virtual worlds and the Metaverse, individuals can create, own, modify, and exchange many virtual ‘thing’s, such as avatar accessory items, lands, artistic creations, cryptocurrencies (digital representations of value exchanged through a decentralized digital ledger system, like blockchain (Huang et al., 2023)), other platform-specific exchange coins (e.g., Robux from Roblox (Roblox, 2023b)), etc. Like their physical-world counterparts, these ‘thing’s can be damaged or stolen. For instance, people can vandalize virtual artworks with graffiti (BBC, 2017), causing permanent damage if the original data are overwritten with no backup (Seo et al., 2023). They can steal accessory items or platform currencies by scamming account information (BBC News, 2007) or hacking (Cavalli, 2008; JY Tan, 2024). While these ‘thing’s are virtual, many virtual worlds, like Roblox and Second Life, and cryptocurrency trading platforms (JY Tan, 2024) allow conversion to and from fiat currency, a government-backed medium of exchange (e.g., paper money or coins) that cannot be redeemed in gold or silver (The Editors of Encyclopedia Britannica, 2024b, 2024c) (e.g., U.S. dollar, euro, or yuan). This allows many to make a living by trading virtual ‘thing’s or services related to them (Brenner, 2008) (e.g., paid shifts in a virtual store in Roblox (Yandoli, 2024)).
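For readers less familiar with how a decentralized ledger records such exchanges, the following is a minimal, illustrative Python sketch of hash-chained ledger entries; the field names and transactions are hypothetical and not any platform’s actual format. Because each entry stores the previous entry’s hash, tampering with a past exchange breaks the chain and becomes detectable.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    """One entry in a toy hash-chained ledger (illustrative only, not a real platform format)."""
    index: int
    transactions: list   # e.g., [{"from": "avatarA", "to": "avatarB", "item": "hat", "price": 10}]
    prev_hash: str       # hash of the previous block; this link makes the chain tamper-evident

    def block_hash(self) -> str:
        payload = json.dumps(
            {"index": self.index, "transactions": self.transactions, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

# Two chained entries: altering a transaction in the first block changes its hash and
# breaks the link stored in the second, so tampering with past exchanges is detectable.
genesis = Block(0, [{"from": "platform", "to": "avatarA", "item": "coin", "amount": 100}], prev_hash="0" * 64)
block1 = Block(1, [{"from": "avatarA", "to": "avatarB", "item": "hat", "price": 10}], prev_hash=genesis.block_hash())
assert block1.prev_hash == genesis.block_hash()  # chain intact
```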

A generally agreed upon criterion for taking legal action against the deprivation of a virtual ‘thing’ (e.g., in Japan (Knight, 2005), the Netherlands (BBC News, 2007), and the United Kingdom (House of Commons Hansard, 2014)) is that the ‘thing’ needs to have ‘real’ monetary value (e.g., sold for “real cash” (Knight, 2005) or bought with “real money” (BBC News, 2007)). This could directly reflect an act’s impact on the stability of the physical world’s economic system (Seo et al., 2023). However, what qualifies as ‘real’ monetary value has been debated (Brenner, 2008; Wang, 2023). This has led to cases where the deprivation of similar ‘thing’s (e.g., game accessories) was disregarded (BBC News, 2005; Cavalli, 2008).

To inform future debate, we illustrate how virtual ‘thing’s could be defined by law by considering the criteria that make a ‘thing’ a property: a ‘thing’ that can legally be owned but that also has scarcity and value, in its use and/or its exchange (The Editors of Encyclopedia Britannica, 2024f). Scarcity implies that a ‘thing’ is limited in amount and not easily obtainable (Fairfield, 2022), in contrast to sunlight (Wang, 2023) or free, unlimited avatar accessories in existing game or Metaverse shops. Use value implies that the ‘thing’ can satisfy its owner’s “subjective needs” (e.g., happiness), while exchange value is generally considered to be its value in fiat currency. While some believe that a property needs to have exchange value, others believe that use value is enough in many contexts (e.g., Japan and Germany), even when it holds only for certain groups (e.g., game accessories for players and artworks for artists) (Wang, 2023). Given that some virtual artworks are currently worth millions of USD (JY Tan, 2024) and some virtual accessories and currencies thousands (Cavalli, 2008), this broader definition could prevent their potential economic impact from being disregarded. To prevent identity confusion (section “Crimes”), the categorization of physical-world property (e.g., land, accessories, artworks, etc.) can serve as inspiration, but virtual ‘thing’s’ unique characteristic of being easily duplicated through computational methods should be taken into consideration (Jiang, 2023; Wang, 2023).
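As a rough schematic of this criteria-based reasoning, the sketch below applies the ownability, scarcity, and use/exchange-value criteria to a hypothetical virtual ‘thing’ under both the stricter and the broader interpretation. The fields and logic are simplified assumptions for illustration, not a legal test.

```python
from dataclasses import dataclass

@dataclass
class VirtualThing:
    legally_ownable: bool   # can it be legally owned (e.g., under platform terms and local law)?
    scarce: bool            # limited in amount, not freely duplicable by anyone
    use_value: bool         # satisfies the owner's subjective needs (e.g., enjoyment)
    exchange_value: bool    # tradable for fiat currency

def qualifies_as_property(t: VirtualThing, require_exchange_value: bool) -> bool:
    """True under either the stricter view (exchange value required) or the
    broader view (use value suffices) described in the text."""
    value = t.exchange_value if require_exchange_value else (t.use_value or t.exchange_value)
    return t.legally_ownable and t.scarce and value

rare_avatar_hat = VirtualThing(legally_ownable=True, scarce=True, use_value=True, exchange_value=False)
print(qualifies_as_property(rare_avatar_hat, require_exchange_value=True))   # False under the stricter view
print(qualifies_as_property(rare_avatar_hat, require_exchange_value=False))  # True under the broader view
```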

While a physical-world object (e.g., a one USD bill) cannot easily be increased in amount, in the virtual world, duplication could require only changing a single input value. This ease has implications for the proportionality of punishment (section “Identity of AI and other computer-controlled entities”). To illustrate, in an existing case, a defendant took advantage of a game platform’s system loophole to freely obtain game points worth 58,194 yuan (about 8135 USD) based on their trading price. While the court convicted the defendant of theft of that value, some consider the punishment disproportionate since the service provider’s cost of distributing the game points is much lower (Wang, 2023). In the Metaverse, treating a similar case based on the cost required to fix the hacking (Seo et al., 2023) or on the trading price could affect recognition of a Metaverse economic system and thus the integration of the virtual part of the Metaverse into the physical. Because the more an individual engages with a virtual world, the more emotional value they place on its virtual ‘thing’s (Strikwerda, 2012), making punishment for the deprivation of these ‘thing’s more severe could seem fairer to a growing number of individuals as the virtual grows (section “Crimes”). It could also better reflect the loss of individuals who make their living from virtual ‘thing’s.
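To make the proportionality concern concrete, the following back-of-the-envelope sketch contrasts the two valuation bases, using the cited trading-price figure and a purely hypothetical provider cost (the source only states that the provider’s cost is much lower).

```python
# Contrast two valuation bases for the same deprivation of virtual 'thing's.
TRADING_PRICE_YUAN = 58_194   # value of the game points at the trading price (cited case)
PROVIDER_COST_YUAN = 500      # hypothetical cost to the provider of issuing/restoring the points

# The chosen basis can change the assessed loss, and hence the punishment, by orders of magnitude.
print(TRADING_PRICE_YUAN / PROVIDER_COST_YUAN)
```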

Money laundering is a process by which criminals transform money of illicit origins (e.g., drug trafficking (Teichmann, 2020)) into money of seemingly legal ones (The Editors of Encyclopedia Britannica, 2024d). Money laundering in the Metaverse and on current platforms can be seen in three stages. The first is Placement, where illicit money is introduced into a (Metaverse) market in the form of virtual ‘thing’s (e.g., cryptocurrencies). The second is Layering, where criminals obscure the origins of the money by creating false proof through a series of transactions (e.g., purchasing through shell companies (Seo et al., 2023)). The third is Integration, where the money, now of seemingly legal origins, is put back into the main economic system (e.g., by being exchanged into fiat currency) (Mooij, 2024a). Cryptocurrencies, viable options for Metaverse currencies (section “Virtual property crimes”), have been used to facilitate money laundering because they are harder to detect, owing to anonymity during creation and the ease of splitting them into small amounts; they require lower creation and transaction costs; and they can speed up the usually tedious Layering stage due to the lack of intermediaries and geographical bounds (Mooij, 2024a; United Nations, 2024). Money laundering not only allows criminals to finance their activities but also propagates them. While the estimated amount of money laundered through cryptocurrencies fluctuates ($9.9 billion in 2020, $18.3 billion in 2021, $31.5 billion in 2022, and $22.2 billion in 2023), possibly due to new methods to evade detection (Chainalysis Team, 2024), the United Nations Office on Drugs and Crime estimates that the money laundered each year amounts to 2% to 5% of global GDP (800 billion to 2 trillion USD) (United Nations Office on Drugs and Crime, 2024), a figure that could increasingly reflect the use of cryptocurrencies or other Metaverse coins. In fact, a study eliciting views on Metaverse crimes from experts (Gómez-Quintero et al., 2024) suggests that money laundering would be a “highly achievable” crime with “high harm”.

The criminalization of money laundering, including through cryptocurrencies, is less contested, with many jurisdictions having taken legal action (e.g., arrests or court cases in the United Kingdom (Liz Jackson and PA Media, 2024), the United States (Biase et al., 2024), Nigeria (Ogbonna, 2024), and China (Le, 2024); tax regulation helping victims in Algeria (Harff, 2024); and regulation to track transactions in Japan (Nikkei staff writers, 2022)). However, analyses of existing court cases, measures (Leuprecht et al., 2023; Mooij, 2024b; Thommandru and Chakka, 2023), and stakeholder opinions (Teichmann, 2020) suggest that the diversity of cryptocurrencies and regulatory measures could be a challenge for law enforcement efforts. Addressing it would require collaboration, mainly with cryptocurrency market service providers and individuals (for private wallets), for investigation (e.g., the disclosure of transaction information (Reuters, 2022)) and standardization (e.g., standards for crime detection, such as requiring more personal information for transactions above a standardized threshold (Mooij, 2024b)). However, this could challenge individuals’ privacy rights and hopes for a decentralized Metaverse. We discuss solutions for information disclosure in the section “Possible approaches”.
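As an illustration of the kind of standardized-threshold rule mentioned above, the sketch below flags transactions that would require additional personal information; the threshold value and transaction fields are hypothetical simplifications, not the standard proposed by Mooij (2024b).

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    sender_wallet: str
    receiver_wallet: str
    amount_usd_equivalent: float
    sender_identity_verified: bool   # e.g., KYC information already held by the service provider

REPORTING_THRESHOLD_USD = 1_000.0    # hypothetical standardized threshold

def requires_extra_identification(tx: Transaction) -> bool:
    """Flag transactions that would require additional personal information
    (or reporting) under a threshold-based standard."""
    return tx.amount_usd_equivalent >= REPORTING_THRESHOLD_USD and not tx.sender_identity_verified

tx = Transaction("wallet_a", "wallet_b", amount_usd_equivalent=2_500.0, sender_identity_verified=False)
print(requires_extra_identification(tx))  # True: the provider would request more personal information
```

Such a per-transaction rule can be evaded by splitting sums into smaller amounts, as noted above for the Layering stage, so a workable standard would likely also aggregate amounts over time and across linked wallets.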

Taxes are amounts of money that a government requires people to pay, mainly to fund its services, such as law enforcement (Cox et al., 2024). Tax violations are crimes related to taxes, such as tax evasion (i.e., avoiding tax payment (Kagan, 2024)) and tax fraud (i.e., the intentional disclosure of false information (Chen, 2024)). Not enforcing taxation in the Metaverse (e.g., for income earned in virtual worlds) could facilitate violations of physical-world taxes, as in the cryptocurrency market (Kim, 2023), justifying the need for Metaverse taxes.

This raises questions about how the Metaverse should be taxed, since existing tax laws are mainly based on one’s relationship with different geographical jurisdictions (e.g., nationality, place of residence, or place of transaction). One way is for taxation to still be based on geographical jurisdictions. The main advantage is the diminished cost of educating a public already familiar with this concept. However, this would require physical-world identity information (e.g., IP address), which could easily be disguised (Kim, 2023). In fact, to combat tax evasion facilitated by anonymity, in March 2023, 48 countries agreed to follow transparency standards by exchanging information among each other and with cryptocurrency platforms (Singh, 2023). However, transparency could harm privacy, potentially discouraging participation in a Metaverse-like economy, as seen in a study of El Salvador (Alvarez et al., 2023), the first country to adopt a cryptocurrency as an official currency. Moreover, taxation based on geographical jurisdictions could lead to the same type of transactions (e.g., in the same virtual world) being taxed differently across jurisdictions. As inequity in the amount of taxes paid could lead to population migration (Kleven et al., 2020; Sandalci and Sandalci, 2021), this could destabilize the world, with individuals with more sought-after skills becoming concentrated in specific areas. Taxation based on Metaverse jurisdictions (e.g., the virtual worlds; section “Rights and duties in Metaverse jurisdictions”) could ensure greater equity and be easier to track (Kim, 2023) while ensuring greater privacy. As the Metaverse and its jurisdictions could have different government services (e.g., technical maintenance), taxes could be applied at both levels, similar to federal (country) and territorial (jurisdictions within) taxes (e.g., (Canada Revenue Agency, 2024)).
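The following sketch illustrates how the two layers could combine on the same income; the flat rates are purely hypothetical assumptions for illustration, as actual rates and bases would be set by the respective jurisdictions.

```python
# Layered taxation on Metaverse income, analogous to territorial plus federal taxes.
METAVERSE_JURISDICTION_RATE = 0.05   # hypothetical 5% levied by the virtual world (e.g., for technical maintenance)
FEDERAL_RATE = 0.15                  # hypothetical 15% levied by the user's country

def total_tax(income_fiat: float) -> float:
    """Return the combined tax owed on income converted to fiat currency."""
    return income_fiat * (METAVERSE_JURISDICTION_RATE + FEDERAL_RATE)

print(total_tax(10_000.0))  # 2000.0: 500 to the virtual world, 1500 to the country
```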

Either way, international collaboration would be needed for information sharing and standardization, including agreement on the currency or currencies recognized by the Metaverse’s economic system. These could be fiat currencies, cryptocurrencies, or some new currencies. Despite the various opportunities for crime that cryptocurrencies present, given their estimated market growth and their growing integration with physical-world life (Statista, 2024a), ignoring or banning them could be costly, not conducive to technological innovation, and even ineffective (Aquilina et al., 2024). Thus, the Metaverse would have to adopt existing cryptocurrencies or use some new currencies. If this Metaverse is decentralized, any new currencies could inherit cryptocurrencies’ dangers, such as volatility, the tendency of prices to change rapidly. Volatility is claimed to be behind many countries’ reluctance to recognize cryptocurrencies (e.g., China (BBC, 2021) and Qatar (The Peninsula Online, 2021)). From an accounting perspective, it leads to difficulties in assessing virtual ‘thing’s’ value (e.g., for taxation purposes), which could lead to economic instability (Huang et al., 2023), large investment losses (Baur and Dimpfl, 2021; Yermack, 2024), and mistrust (Rehman et al., 2020). Trust affected people’s (lack of) acceptance of cryptocurrency as currency in El Salvador (Alvarez et al., 2023), potentially requiring more regulation (section “Ethical implications and other challenges during law enforcement”).
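Volatility can be quantified, for instance, as the standard deviation of daily log returns; the sketch below uses made-up prices for a hypothetical Metaverse coin to show how such a measure could support the accounting assessments mentioned above.

```python
import math
import statistics

prices = [100.0, 112.0, 95.0, 130.0, 118.0]  # illustrative daily closing prices of a hypothetical coin

log_returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
daily_volatility = statistics.stdev(log_returns)

print(round(daily_volatility, 3))  # higher values mean prices swing more from day to day
```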

Identity theft is a form of impersonation in which the culprit uses the victim’s personal information without their consent, possibly to commit fraud (Radin, 2024). Identity theft is a crime globally, with its protection service market expected to grow from 11.9 billion USD in 2024 to 28.1 billion USD in 2034 (Fact.MR, 2024). In the Metaverse, identity theft could occur when a culprit hacks into an individual’s account and then pretends to be them (e.g., through their avatar (Cheong, 2022)), or collects identifying information (e.g., avatar appearance and gestures) and then forges an avatar or a similar representation (Deng et al., 2023). The stolen identity could be used to impersonate service providers (e.g., doctors) for payments (e.g., for false medical advice), law enforcement officers for information crucial to the Metaverse’s security (Gómez-Quintero et al., 2024), or someone’s loved one to extract a kidnapping ransom without an actual kidnapping (Geldenhuys, 2023; Sudhakar and Shanthi, 2023). Various measures exist for prevention (e.g., periodically changing avatar appearance (Falchuk et al., 2018), authentication based on both biometrics and digital information (Yang et al., 2023), or splitting personal information across the Internet network (Cheong, 2022)) and for detection (e.g., algorithms that analyze the authenticity of potentially AI-generated content (Gupta et al., 2024)), but they need to constantly evolve with technological advances and could themselves be used to facilitate other crimes (Seo et al., 2023). As victims and perpetrators of the same case can be on opposite sides of the world (Radin, 2024), international collaboration is needed for information sharing and technical standards (section “Ethical implications and other challenges during law enforcement”).
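As one illustration of the information-splitting idea, the sketch below shows a minimal XOR-based split of a piece of personal data into two shares, neither of which is meaningful alone. This is a generic instance of secret sharing, not necessarily the scheme proposed by Cheong (2022), and the data fields are hypothetical.

```python
import secrets

def split(data: bytes) -> tuple[bytes, bytes]:
    share1 = secrets.token_bytes(len(data))               # random share stored on one server
    share2 = bytes(a ^ b for a, b in zip(data, share1))   # complementary share stored elsewhere
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

identity = b"avatar: Alice; birthdate: 2001-05-17"  # hypothetical personal information
s1, s2 = split(identity)
assert combine(s1, s2) == identity   # both shares are needed to reconstruct the information
```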

We explore major fantasy crimes.

Gambling is the staking of a valuable ‘thing’ on the outcome of an uncertain event while knowing the risks and hoping to gain something from it. Gambling games include poker, horse betting, slot machines, and the lottery (Glimne, 2024). Gambling can be done in the physical world or online (e.g., in a virtual world’s casino). Jurisdictions have varying levels of restriction for (online) gambling, generally ranging from an outright ban to allowing it under some age and game-type restrictions (The World Financial Review, 2022). Online gambling can impact individuals and society economically, through their physical-world finances and the ease of money laundering (Tomic, 2022), and psychologically, through the development of addiction (Brenner, 2008), aggravating the negative economic impact in the long term and harming a society’s mental health (López-Torres et al., 2021). Gambling that does not involve ‘real’ money (e.g., the exchange of stakes for fiat currencies), known as “simulated gambling” (Hing et al., 2022), has fewer legal restrictions, and even advice from the service provider can be limited (Hing et al., 2022). However, research has raised concerns, mainly about the transfer of simulated gambling habits to ‘real’ money gambling and other addictions (e.g., drugs), especially for younger individuals (Hing et al., 2022; Nower et al., 2022; Rahman et al., 2012). After a review of the research literature, the Albanese Government (Australia) decided to introduce age restrictions for simulated gambling starting in September 2024 (Ministers for the Department of Infrastructure, 2023). Similarly, the Consumer Council of Hong Kong advocated for more legal restrictions in 2024 (Consumer Council, 2024). As in the case of media violence (section “Crimes”), the weight put on the harmful effects of simulated gambling versus individuals’ freedom to explore could reshape society.

Prostitution is the practice of engaging in sexual activity to obtain immediate payment (Jenkins, 2024). While prostitution occurring in the physical world is more or less restricted legally (Nicola, 2021), prostitution occurring in the virtual is even more of a gray area. In existing virtual worlds (e.g., Second Life (Cavalli, 2009; Dane, 2020)), and similarly in the Metaverse, avatars can simulate sexual activity (Bellini, 2024) through animated visuals and wearable devices for touch sensation (Evans, 2023) and obtain payment (Brenner, 2008) (e.g., in the platform’s coins, which could be converted into fiat currency). When it does not involve a lack of consent (section “Virtual sexual assault”), a minor (i.e., someone not of consenting age), or someone looking like a minor (section “Sexual ageplay and child pornography”), the impact of this virtual prostitution could mainly be economic, as there are gains and losses of money, and psychological, due to embodiment (section “Identity”).

To inform future decisions, we consider the pros and cons. Research on the physical world suggests that recognizing prostitution could be beneficial to society by making current underground prostitution transactions eligible for taxation (Al Hazmi, 2024), facilitating regulation of already existing prostitution-related crimes (e.g., human trafficking) (Lee and Persson, 2022), allowing marginalized groups to make a living (Yasin and Namoco, 2021), and decreasing rape rates and sexual violence (Gao and Petrova, 2022). The current existence of virtual prostitution suggests a demand for it, which could lead to a need to regulate similar consequent crimes (e.g., tax violations and trafficking of avatars (Gómez-Quintero et al., 2024)). Moreover, compared to its physical-world counterpart, virtual prostitution does not have health risks like sexually transmitted diseases or physical injury (Brenner, 2008). On the other hand, research in the physical world suggests that legalizing prostitution could influence the public’s perception of the morality of “purchasing sex”, potentially reshaping the world by expanding the market, leading to discriminatory behavior, and encouraging more criminal activities (e.g., sex trafficking) (Raymond, 2004). As virtual prostitution seems inevitable, a possible mediation is to contain it to specific areas both legally and computationally, requiring an understanding of different cultures, economies, and laws.

Assault is an attempt at unlawfully using physical force against another (The Editors of Encyclopedia Britannica, 2024a). The digital version of acts that harm humans physically cannot harm an avatar or any similar representation in the Metaverse unless it has been programmed (e.g., for violent video games) or hacked to allow this. When the original files are damaged, the harm done could warrant legal action (section “Virtual property crimes”). While the avatar is data that could ‘belong’ to one or several humans, it is also an extension of one’s identity, psychologically and possibly legally (section “Identity”). Its categorization as a person, who can have their own rights and duties, or as something else, such as property, could affect the victim’s perception of fairness and others’ recognition of the Metaverse identity. This could, in turn, encourage or discourage ethical behavior (section “Crimes”). Across history, laws have treated the avatar’s categorization differently, mainly between person and property (Andrade, 2009), currently leaning more toward non-person categories (Oleksii et al., 2024). However, many scholars advocate for a possibly new categorization: a legal person, as allocating separate rights and duties could reduce ambiguities (Cheong, 2022); a balance between property and person, such as property with stronger legal protection, to take into account the personal, emotional connection with the human behind the avatar (Andrade, 2009); or a balance between property and identity through a combination of existing categories (Oleksii et al., 2024). We discuss how the assessment of physical harm (which the definition of assault is grounded in) could be reflected through the assessment of psychological harm in the section “Virtual sexual assault”.

Sexual assault is a form of assault involving some sexual conduct performed on a person without their consent (e.g., unwanted touching or sexual penetration) (Bellini, 2024). We consider sexual assault beyond the context of roleplay (i.e., consenting participants pretending). In the Metaverse, sexual assault could take the form of suggestive touching of the victim’s avatar, sexual acts on their avatar (e.g., section “Introduction”), or the hacking of the platform to make avatars perform sexual acts on each other (Brenner, 2008; Dibbell, 2005). When there is no permanent destruction of data (section “Virtual murder and physical assault leading to permanent damage of an avatar”), the main impact of assault is psychological. As the definition of assault requires physical contact (The Editors of Encyclopedia Britannica, 2024a), criminalization of virtual sexual assault is still debated (Bellini, 2024; Gómez-Quintero et al., 2024).

The concept of embodiment (section “Identity”) supports the similarity between the psychological harm from acts in the virtual and that from acts in the physical (as claimed in the case reported by Camber (2024)). For sexual acts specifically, research further supports the realism of sexual experiences through immersive technologies (Dekker et al., 2021), with the potential to reshape behaviors (Evans, 2023). The question is whether psychological harm can destabilize society enough to warrant legal action. Currently, the psychological impact of sexual assault (e.g., trauma) can lead to long-term effects, such as higher risks of engaging in other harmful behavior (e.g., hard drug use and contemplation of suicide), deteriorating relationships and work life, and a ripple effect with high costs to society (Bellini, 2024), with rape estimated to cost 122,461 USD over a victim’s lifetime (Peterson et al., 2017). Based on potential harm severity, frequency, ease of achievement, and ease of defeat, experts have assessed virtual sexual assault as one of the top ten Metaverse crime risks (Gómez-Quintero et al., 2024). With advances in the realism of immersive technologies, both visual and multisensory, the impact of sexual assault on the victim could worsen. Not criminalizing virtual sexual assault could reinforce already existing victim-blaming, stereotypes, and discrimination (Dyar et al., 2021; Stubbs-Richardson et al., 2018) and even affect sexual assault occurring in the physical world due to identity confusion (section “Crimes”). In terms of Metaverse-specific prevention, user setting features (section “Ethical implications and other challenges during law enforcement”) and restricted areas (section “Prostitution”) could be implemented to balance different stakeholders’ freedom. In line with legal tradition, the severity of punishment could be decided based on the psychological harm a “reasonable” person would suffer (Bellini, 2024), possibly informed by more recent population statistics. A similar line of reasoning could be applied to other acts where only psychological harm seems present (e.g., virtual physical assault outside the context of play).

Sexual ageplay, legal in the physical world, can be defined as consenting adults participating in sexual activity in which one or several of them pretend to be children. This is more controversial in the virtual, since adults can make their avatars look different from what people of their chronological age usually look like. An example was Second Life, where some adult users engaged in sexual ageplay by donning avatars that looked like children. Opinions among both users and the rest of the public (Brenner, 2008; Reeves, 2018) were divided. Proponents mainly argued that no ‘real’ child was hurt. Opponents ranged from those who question the morality of sexual ageplay in general to those who question the visual depiction of children, ‘real’ or not, engaging in sexual acts. All worried about escalation to violence against ‘real’ children. This division is considered in line with the debate on virtual child pornography, entirely computer-generated sexually explicit graphics of “fictitious” children (Christensen et al., 2021). As neither involves harm done to ‘real’ children, some lean toward protecting individuals’ freedom. In Ashcroft v. Free Speech Coalition (Ward, 2009), the Court highlighted the absence of causality between consuming virtual child pornography and the harm that could follow (Ratner, 2021). While research does suggest that virtual child pornography could lead to addiction and then to crimes against ‘real’ children (Christensen et al., 2021), it also suggests that a ban could lead to less tolerance even for what is legal (e.g., a ban on child-looking avatars in general) (Reeves, 2018), potentially limiting more freedom than expected in the Metaverse. With advances in graphics and AI and the proliferation of cases of AI-generated pornography across the world (Bae and Yeung, 2023), international collaboration on the (non-)criminalization of virtual child pornography has become a pressing issue.

Prior work (Brenner, 2008, pp. 95-96) mentions the hypothetical situation of the reconstruction of a Nazi death camp where users can roleplay as Nazis and inmates. While this would be illegal in many European countries, it would not be in the United States. An analysis (Cowan and Maitles, 2011) of simulations of sensitive historical events (e.g., the Holocaust) and scenarios (e.g., racism) reveals divided opinions, even when the simulations were intended to increase empathy. Cons include perceived disrespect toward the victims due to a lack of accuracy (Cowan and Maitles, 2011), identity confusion if such acts are popularized (section “Crimes”), and other potential impacts on the mental health of those roleplaying and of the audience, such as increased fear of future terrorism due to graphic media (Holman et al., 2020). A possible mediation with one’s freedom could be to raise awareness about potential harms but still allow simulations (e.g., for artistic expression and education) under stricter conditions (e.g., not in publicly accessible areas), requiring cultural and historical insights from different jurisdictions.
