
Abstract

This article makes the case that a powerful new discipline, which we term perception engineering, is steadily emerging. It follows from a progression of ideas that involve creating illusions, from historical paintings and film to modern video games and virtual reality. Rather than creating physical artifacts such as bridges, airplanes, or computers, perception engineers create illusory perceptual experiences. The scope is defined over any agent that interacts with the physical world, including both biological organisms (humans and animals) and engineered systems (robots and autonomous systems). The key idea is that an agent, called a producer, alters the environment with the intent to alter the perceptual experience of another agent, called a receiver. Most importantly, the article introduces a precise mathematical formulation of this process, based on the von Neumann–Morgenstern notion of information, to help scope and define the discipline. This formulation is then applied to the cases of engineered and biological agents, with discussion of its implications for existing fields such as virtual reality, robotics, and even social media. Finally, open challenges and opportunities for involvement are identified.
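
To make the producer–receiver idea above concrete, here is a minimal, purely illustrative sketch in Python of one agent (the producer) acting on a shared environment so that another agent's (the receiver's) internal perceptual state is steered toward a target experience. This is not the article's formulation; every name below (Environment, Producer, Receiver, their methods, the scalar "light" state, and the gains) is a hypothetical stand-in chosen only to show the coupling through the physical world.

    # Hypothetical sketch: a producer alters a shared environment with the
    # intent of altering a receiver's perceptual experience. The receiver's
    # "belief" loosely stands in for an internal (information) state formed
    # from its observations; dynamics and gains are invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Environment:
        """Shared physical world, reduced here to a single scalar light level."""
        light: float = 0.0

        def apply(self, action: float) -> None:
            self.light += action  # the producer's action changes the world

    @dataclass
    class Receiver:
        """Agent whose perceptual state is derived only from its observations."""
        belief: float = 0.0   # internal perceptual/information state
        gain: float = 0.5     # how strongly new evidence updates the state

        def sense(self, env: Environment) -> float:
            return env.light  # noiseless sensor mapping, for simplicity

        def update(self, observation: float) -> None:
            # simple recursive filter: the state moves toward the observation
            self.belief += self.gain * (observation - self.belief)

    @dataclass
    class Producer:
        """Agent that acts on the environment to shape the receiver's experience."""
        target_experience: float

        def act(self, receiver: Receiver) -> float:
            # nudge the world in whatever direction moves the receiver's
            # perceptual state toward the intended experience
            return 0.3 * (self.target_experience - receiver.belief)

    if __name__ == "__main__":
        env = Environment()
        producer = Producer(target_experience=1.0)
        receiver = Receiver()
        for _ in range(20):
            env.apply(producer.act(receiver))
            receiver.update(receiver.sense(env))
        print(f"receiver's perceptual state after 20 steps: {receiver.belief:.3f}")

The key design point the sketch tries to convey is that the producer never writes to the receiver's internal state directly; it can only influence it indirectly, through changes to the shared environment that the receiver then senses.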


Literature Cited

  1. Slater M, Banakou D, Beacco A, Gallego J, Macia-Varela F, Oliva R. 2022. A separate reality: an update on place illusion and plausibility in virtual reality. Front. Virtual Real. 3:81
  2. von Sömmerring ST. 1796. Über das Organ der Seele. Königsberg, Ger.: Nicolovius
  3. Prince S. 2010. Through the looking glass: philosophical toys and digital visual effects. Projections 4:19–40
  4. Naik H, Bastien R, Navab N, Couzin ID. 2020. Animals in virtual environments. IEEE Trans. Vis. Comput. Graph. 26(5):2073–83
  5. von Neumann J, Morgenstern O. 1944. Theory of Games and Economic Behavior. Princeton, NJ: Princeton Univ. Press
  6. LaValle SM. 2006. Planning Algorithms. Cambridge, UK: Cambridge Univ. Press. Available at http://lavalle.pl/planning
  7. Sakcak B, Weinstein V, LaValle SM. 2023. The limits of learning and planning: minimal sufficient information transition systems. In Algorithmic Foundations of Robotics XV, ed. SM LaValle, JM O'Kane, M Otte, D Sadigh, P Tokekar, pp. 256–72. Berlin: Springer
  8. LaValle SM. 2012. Sensing and filtering: a fresh perspective based on preimages and information spaces. Found. Trends Robot. 1(4):253–372
  9. Weinstein V, Sakcak B, LaValle SM. 2022. An enactivist-inspired mathematical model of cognition. Front. Neurorobot. 16:846982
  10. Thrun S, Burgard W, Fox D. 2005. Probabilistic Robotics. Cambridge, MA: MIT Press
  11. Skarbez R, Brooks FP, Whitton MC. 2017. A survey of presence and related concepts. ACM Comput. Surv. 50(6):1–39
  12. Hillis JM, Ernst MO, Banks MS, Landy MS. 2002. Combining sensory information: mandatory fusion within, but not between, senses. Science 298(5098):1627–30
  13. Fainekos GE, Girard A, Kress-Gazit H, Pappas GJ. 2009. Temporal logic motion planning for dynamic mobile robots. Automatica 45(2):343–52
  14. Shin H, Kim D, Kwon Y, Kim Y. 2017. Illusion and dazzle: adversarial optical channel exploits against lidars for automotive applications. In Cryptographic Hardware and Embedded Systems—CHES 2017, ed. W Fischer, N Homma, pp. 445–67. Cham, Switz.: Springer
  15. Suomalainen M, Nilles AQ, LaValle SM. 2020. Virtual reality for robots. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 11458–65. Piscataway, NJ: IEEE
  16. Trippel T, Weisse O, Xu W, Honeyman P, Fu K. 2017. WALNUT: waging doubt on the integrity of MEMS accelerometers with acoustic injection attacks. In 2017 IEEE European Symposium on Security and Privacy, pp. 3–18. Piscataway, NJ: IEEE
  17. Shell DA, O'Kane JM. 2020. Reality as a simulation of reality: robot illusions, fundamental limits, and a physical demonstration. In 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 10327–34. Piscataway, NJ: IEEE
  18. LaValle SM. 2023. Virtual Reality. Cambridge, UK: Cambridge Univ. Press
  19. Friston K, Kilner J, Harrison L. 2006. A free energy principle for the brain. J. Physiol. Paris 100(1–3):70–87
  20. Gregory RL. 1980. Perceptions as hypotheses. Philos. Trans. R. Soc. B 290(1038):181–97
  21. Rao RPN, Ballard DH. 1999. Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nat. Neurosci. 2(1):79–87
  22. Fechner GT. 1888. Elemente der Psychophysik, Vol. 1. Leipzig, Ger.: Breitkopf & Härtel
  23. Kingdom FAA, Prins N. 2016. Psychophysics: A Practical Introduction. London: Academic
  24. Treutwein B. 1995. Minireview: adaptive psychophysical procedures. Vis. Res. 35(17):2503–22
  25. Beck DM, Kastner S. 2009. Top-down and bottom-up mechanisms in biasing competition in the human brain. Vis. Res. 49(10):1154–65
  26. Meehan M, Insko B, Whitton M, Brooks FP Jr. 2002. Physiological measures of presence in stressful virtual environments. ACM Trans. Graph. 21(3):645–52
  27. Jeung S, Hilton C, Berg T, Gehrke L, Gramann K. 2023. Virtual reality for spatial navigation. In Virtual Reality in Behavioral Neuroscience: New Insights and Methods, ed. C Maymon, G Grimshaw, YC Wu, pp. 103–29. Berlin: Springer
  28. Lafer-Sousa R, Hermann KL, Conway BR. 2015. Striking individual differences in color perception uncovered by 'the dress' photograph. Curr. Biol. 25(13):R545–46
  29. Wertheimer M. 1912. Experimentelle Studien über das Sehen von Bewegung [Experimental studies on the perception of motion]. Z. Psychol. 61:161–265
  30. Gallier J. 2000. Curves and Surfaces in Geometric Modeling. San Francisco: Morgan Kaufmann
  31. O'Regan J, Noë A. 2001. A sensorimotor account of vision and visual consciousness. Behav. Brain Sci. 24:939–73
  32. Gonzalez-Franco M, Lanier J. 2017. Model of illusions and virtual reality. Front. Psychol. 8:1125
  33. Clark A. 2013. Whatever next? Predictive brains, situated agents, and the future of cognitive science. Behav. Brain Sci. 36(3):181–204
  34. Walsh KS, McGovern DP, Clark A, O'Connell RG. 2020. Evaluating the neurophysiological evidence for predictive processing as a model of perception. Ann. N.Y. Acad. Sci. 1464(1):242–68
  35. Bastos AM, Usrey WM, Adams RA, Mangun GR, Fries P, Friston KJ. 2012. Canonical microcircuits for predictive coding. Neuron 76(4):695–711
  36. Shipp S, Adams RA, Friston KJ. 2013. Reflections on agranular architecture: predictive coding in the motor cortex. Trends Neurosci. 36(12):706–16
  37. Skarbez R, Brooks FP, Whitton MC. 2020. Immersion and coherence: research agenda and early results. IEEE Trans. Vis. Comput. Graph. 27(10):3839–50
  38. Slater M. 2004. How colorful was your day? Why questionnaires cannot assess presence in virtual environments. Presence 13(4):484–93
  39. Usoh M, Catena E, Arman S, Slater M. 2000. Using presence questionnaires in reality. Presence 9(5):497–503
  40. Kennedy RS, Lane NE, Berbaum KS, Lilienthal MG. 1993. Simulator sickness questionnaire: an enhanced method for quantifying simulator sickness. Int. J. Aviat. Psychol. 3(3):203–20
  41. Lawson BD. 2015. Motion sickness symptomatology and origins. In Handbook of Virtual Environments, ed. KS Hale, KM Stanney, pp. 531–600. Boca Raton, FL: CRC. 2nd ed.
  42. Stanney K, Lawson BD, Rokers B, Dennison M, Fidopiastis C, et al. 2020. Identifying causes of and solutions for cybersickness in immersive technology: reformulation of a research and development agenda. Int. J. Hum.-Comput. Interact. 36(19):1783–803
  43. Oman CM. 1990. Motion sickness: a synthesis and evaluation of the sensory conflict theory. Can. J. Physiol. Pharmacol. 68(2):294–303
  44. Hoffman DM, Girshick AR, Akeley K, Banks MS. 2008. Vergence–accommodation conflicts hinder visual performance and cause visual fatigue. J. Vis. 8(3):33
  45. Keshavarz B, Hecht H, Lawson BD. 2015. Visually induced motion sickness: causes, characteristics, and countermeasures. In Handbook of Virtual Environments, ed. KS Hale, KM Stanney, pp. 647–98. Boca Raton, FL: CRC. 2nd ed.
  46. Mankowska ND, Marcinkowska AB, Waskow M, Sharma RI, Kot J, Winklewski PJ. 2021. Critical flicker fusion frequency: a narrative review. Medicina 57(10):1096
  47. Saygin AP, Chaminade T, Ishiguro H, Driver J, Frith C. 2012. The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Soc. Cogn. Affect. Neurosci. 7(4):413–22
  48. Levoy M, Whitaker R. 1990. Gaze-directed volume rendering. In I3D '90: Proceedings of the 1990 Symposium on Interactive 3D Graphics, pp. 217–23. New York: ACM
  49. Denes G, Maruszczyk K, Ash G, Mantiuk RK. 2019. Temporal resolution multiplexing: exploiting the limitations of spatio-temporal vision for more efficient VR rendering. IEEE Trans. Vis. Comput. Graph. 25(5):2072–82
  50. Mark WR, McMillan L, Bishop G. 1997. Post-rendering 3D warping. In I3D '97: Proceedings of the 1997 Symposium on Interactive 3D Graphics, pp. 7–16. New York: ACM
  51. Steinicke F, Bruder G, Jerald J, Frenz H, Lappe M. 2009. Estimation of detection thresholds for redirected walking techniques. IEEE Trans. Vis. Comput. Graph. 16(1):17–27
  52. Bailenson JN, Beall AC, Loomis J, Blascovich J, Turk M. 2004. Transformed social interaction: decoupling representation from behavior and form in collaborative virtual environments. Presence 13(4):428–41
  53. Rosa SO, Sovrano VA, Vallortigara G. 2014. What can fish brains tell us about visual perception? Front. Neural Circuits 8:119
  54. Thurley K, Ayaz A. 2017. Virtual reality systems for rodents. Curr. Zool. 63(1):109–19
  55. Larsch J, Baier H. 2018. Biological motion as an innate perceptual mechanism driving social affiliation. Curr. Biol. 28(22):3523–32
  56. Stowers JR, Hofbauer M, Bastien R, Griessner J, Higgins P, et al. 2017. Virtual reality for freely moving animals. Nat. Methods 14(10):995–1002
  57. Aghajan ZM, Acharya L, Moore JJ, Cushman JD, Vuong C, Mehta MR. 2015. Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nat. Neurosci. 18(1):121–28
  58. Sato N, Sakata H, Tanaka Y, Taira M. 2004. Navigation in virtual environment by the macaque monkey. Behav. Brain Res. 153(1):287–91
  59. Cruz-Neira C, Sandin DJ, DeFanti TA, Kenyon RV, Hart JC. 1992. The CAVE: audio visual experience automatic virtual environment. Commun. ACM 35(6):64–72
  60. Kunst M, Laurell E, Mokayes N, Kramer A, Kubo F, et al. 2019. A cellular-resolution atlas of the larval zebrafish brain. Neuron 103:21–38
  61. Trivedi CA, Bollmann JH. 2013. Visually driven chaining of elementary swim patterns into a goal-directed motor sequence: a virtual reality study of zebrafish prey capture. Front. Neural Circuits 7:86
  62. Faumont S, Rondeau G, Thiele TR, Lawton KJ, McCormick KE, et al. 2011. An image-free opto-mechanical system for creating virtual environments and imaging neuronal activity in freely moving Caenorhabditis elegans. PLOS ONE 6(9):e24666
  63. Cook SJ, Jarrell TA, Brittin CA, Wang Y, Bloniarz AE, et al. 2019. Whole-animal connectomes of both Caenorhabditis elegans sexes. Nature 571(7763):63–71
  64. Fearing R. 1991. Control of a micro-organism as a prototype micro-robot. Paper presented at the 2nd International Symposium on Micromachines and Human Sciences, Nagoya, Jpn., Oct. 8–9
  65. Fodor JA. 1986. Why paramecia don't have mental representations. Midwest Stud. Philos. 10:3–23
  66. Erdmann MA, Mason MT. 1988. An exploration of sensorless manipulation. IEEE Trans. Robot. Autom. 4(4):369–79
  67. Huzaifa M, Desai R, Grayson S, Jiang X, Jing Y, et al. 2022. ILLIXR: an open testbed to enable extended reality systems research. IEEE Micro 42(4):97–106
  68. Gallagher S. 2017. Enactivist Interventions: Rethinking the Mind. Oxford, UK: Oxford Univ. Press
  69. Hipólito I. 2022. Cognition without neural representation: dynamics of a complex system. Front. Psychol. 12:643276
  70. Hutto DD, Myin E. 2012. Radicalizing Enactivism: Basic Minds Without Content. Cambridge, MA: MIT Press