American Psychological Association 6th edition

Bian, Y., Somani, A., & Department of Electrical and Computer Engineering. (2025). CQS-Attention: Scaling Up the Standard Attention Computation for Infinitely Long Sequences. https://doi.org/10.1109/ACCESS.2025.3544550

ISO-690 (author-date, English)

BIAN, Yiming, SOMANI, Arun and DEPARTMENT OF ELECTRICAL AND COMPUTER ENGINEERING, 2025. CQS-Attention: Scaling Up the Standard Attention Computation for Infinitely Long Sequences. https://doi.org/10.1109/ACCESS.2025.3544550. 1 January 2025.

Modern Language Association 9th edition

Bian, Y., A. Somani, and Department of Electrical and Computer Engineering. "CQS-Attention: Scaling Up the Standard Attention Computation for Infinitely Long Sequences." https://doi.org/10.1109/ACCESS.2025.3544550, January 2025.

Mohr Siebeck - Law (German - Austria)

Bian, Yiming/Somani, Arun/Department of Electrical and Computer Engineering: CQS-Attention: Scaling Up the Standard Attention Computation for Infinitely Long Sequences, https://doi.org/10.1109/ACCESS.2025.3544550, 2025.

Emerald - Harvard

Bian, Y., Somani, A. and Department of Electrical and Computer Engineering. (2025), "CQS-Attention: Scaling Up the Standard Attention Computation for Infinitely Long Sequences", https://doi.org/10.1109/ACCESS.2025.3544550.

Note: These citations may not be 100% correct.