
DESpace: spatially variable gene detection via differential expression testing of spatial clusters


Cai, Peiying; Robinson, Mark D; Tiberi, Simone (2024). DESpace: spatially variable gene detection via differential expression testing of spatial clusters. Bioinformatics, 40(2):btae027.

Abstract

Motivation: Spatially resolved transcriptomics (SRT) enables scientists to investigate the spatial context of mRNA abundance, including identifying spatially variable genes (SVGs), i.e., genes whose expression varies across the tissue. Although several methods have been proposed for this task, native SVG tools cannot jointly model biological replicates, or identify the key areas of the tissue affected by spatial variability.
Results: Here, we introduce DESpace, a framework to discover SVGs, based on an original application of existing methods. In particular, our approach accepts all types of SRT data, summarizes spatial information via spatial clusters, and identifies spatially variable genes by performing differential gene expression testing between clusters. Furthermore, our framework can identify (and test) the main cluster of the tissue affected by spatial variability; this allows scientists to investigate spatial expression changes in specific areas of interest. Additionally, DESpace enables joint modelling of multiple samples (i.e., biological replicates); compared to inference based on individual samples, this approach increases statistical power, and targets SVGs with consistent spatial patterns across replicates. Overall, in our benchmarks, DESpace displays good true positive rates, controls for false positive and false discovery rates, and is computationally efficient.
Availability and implementation: DESpace is freely distributed as a Bioconductor R package at https://bioconductor.org/packages/DESpace.
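The core idea of the abstract — flagging a gene as spatially variable when its expression differs across spatial clusters — can be illustrated with a minimal sketch. Note this is not DESpace's actual implementation (the package performs differential expression testing with established DE machinery in R/Bioconductor); here plain Python with a Kruskal-Wallis test stands in for the cluster-wise comparison, and all variable names and the simulated data are hypothetical:

```python
# Illustrative sketch only: mimic "SVG detection via differential expression
# testing between spatial clusters" with a simple Kruskal-Wallis test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_spots = 300
# Hypothetical spatial cluster label for each spot (e.g., from a clustering tool).
clusters = rng.integers(0, 3, size=n_spots)

# Gene A: mean expression shifts in one cluster -> spatially variable.
gene_sv = rng.poisson(lam=5 + 10 * (clusters == 2))
# Gene B: same distribution in every cluster -> not spatially variable.
gene_flat = rng.poisson(lam=5, size=n_spots)

def svg_pvalue(counts, clusters):
    """P-value for expression differences across spatial clusters."""
    groups = [counts[clusters == c] for c in np.unique(clusters)]
    return stats.kruskal(*groups).pvalue

p_sv = svg_pvalue(gene_sv, clusters)      # small: SVG candidate
p_flat = svg_pvalue(gene_flat, clusters)  # large: no spatial pattern
```

In DESpace, genes with small (multiple-testing-adjusted) p-values across clusters are reported as SVG candidates; the same cluster-wise logic also underlies the individual-cluster test described in the abstract.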



Additional indexing

Item Type: Journal Article, refereed, original work
Communities & Collections: 07 Faculty of Science > Institute of Molecular Life Sciences
Dewey Decimal Classification: 570 Life sciences; biology
Language: English
Date: 1 February 2024
Deposited On: 01 Feb 2024 16:02
Last Modified: 30 Jun 2024 01:38
Publisher: Oxford University Press
ISSN: 1367-4803
OA Status: Gold
Free access at: Publisher DOI. An embargo period may apply.
Publisher DOI: https://doi.org/10.1093/bioinformatics/btae027
PubMed ID: 38243704
  • Content: Published Version
  • Language: English
  • Licence: Creative Commons: Attribution 4.0 International (CC BY 4.0)