
Abstract

This article provides an overview of tensors, their properties, and their applications in statistics. Tensors, also known as multidimensional arrays, generalize matrices to higher orders and are useful structures for representing data. We first review basic tensor concepts and decompositions, and then we elaborate on traditional and recent applications of tensors in the fields of recommender systems and imaging analysis. We also illustrate how tensors represent network data and capture the relations among interacting units in a complex network system. Canonical tensor computational algorithms and available software libraries are described for various tensor decompositions. Future research directions, including tensors in deep learning, are also discussed.
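To make the two classical decompositions mentioned above concrete, the sketch below fits a CP (CANDECOMP/PARAFAC) decomposition and a Tucker decomposition to a small synthetic third-order tensor. This is a minimal illustration, not code from the article: it assumes the open-source TensorLy Python library (one of several packages implementing these decompositions), and the tensor shape and ranks are arbitrary choices.

```python
# Minimal sketch (assumption: TensorLy library; not code from the article)
# fitting CP and Tucker decompositions to a synthetic order-3 tensor.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, tucker

rng = np.random.default_rng(0)
# Synthetic third-order tensor, e.g. users x items x time points.
X = tl.tensor(rng.random((10, 8, 6)))

# CP/PARAFAC: approximate X by a sum of R rank-1 tensors (R = 3 here).
cp = parafac(X, rank=3)
X_cp = tl.cp_to_tensor(cp)

# Tucker: a small core tensor multiplied by a factor matrix along each mode.
tk = tucker(X, rank=[3, 3, 3])
X_tk = tl.tucker_to_tensor(tk)

# Relative reconstruction errors show how well each low-rank structure fits.
print("CP     relative error:", float(tl.norm(X - X_cp) / tl.norm(X)))
print("Tucker relative error:", float(tl.norm(X - X_tk) / tl.norm(X)))
```

CP expresses the tensor as a sum of rank-1 components, while Tucker retains a dense core that models interactions among the mode-wise factor matrices; the relative errors indicate how well each low-rank structure approximates the data.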


  • Article Type: Review Article