Towards block-based compression of genomic data with random access functionality
- Authors
- Tom Paridaens (UGent), Yves Van Stappen, Wesley De Neve (UGent), Peter Lambert (UGent) and Rik Van de Walle (UGent)
- Abstract
- Current algorithms for compressing genomic data mostly focus on achieving high levels of effectiveness and reasonable levels of efficiency, ignoring the need for features such as random access and stream processing. Therefore, in this paper, we introduce a novel framework for compressing genomic data, with the aim of allowing for a better trade-off between effectiveness, efficiency and functionality. To that end, we draw upon concepts taken from the area of media data processing. In particular, we propose to compress genomic data as small blocks of data, using encoding tools that predict the nucleotides and that correct the prediction made by storing a residue. We also propose two techniques that facilitate random access. Our experimental results demonstrate that the compression effectiveness of the proposed approach is up to 1.91 bits per nucleotide, which is significantly better than binary encoding (3 bits per nucleotide) and Huffman coding (2.21 bits per nucleotide).
- Keywords
- Genomic Data Storage, DNA Sequence Compression, Random Access
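The abstract above outlines the general approach: the sequence is split into small blocks, each nucleotide is predicted from context, and only a residue correcting that prediction is stored, so that individual blocks can be decoded independently for random access. The sketch below is a minimal, hypothetical illustration of that idea in Python; the block size, the order-1 "most frequent successor" predictor, and the residue ranking are assumptions made for illustration only and are not the encoding tools or entropy coding described in the paper.

```python
# Hypothetical sketch of block-based prediction + residue coding for nucleotides.
# NOT the authors' codec: block size, predictor, and residue alphabet are assumptions.

from collections import defaultdict

BLOCK_SIZE = 64          # assumed block length; blocks are encoded independently
ALPHABET = "ACGT"

def encode_block(block: str) -> list[int]:
    """Encode one block as residues: 0 means the prediction was correct,
    1..3 select one of the remaining symbols (order-1 predictor, illustrative)."""
    successor_counts = defaultdict(lambda: defaultdict(int))
    residues = []
    prev = None
    for symbol in block:
        # Prediction: most frequently seen successor of the previous symbol so far.
        if prev is not None and successor_counts[prev]:
            prediction = max(successor_counts[prev], key=successor_counts[prev].get)
        else:
            prediction = "A"
        if symbol == prediction:
            residues.append(0)
        else:
            others = [c for c in ALPHABET if c != prediction]
            residues.append(1 + others.index(symbol))
        if prev is not None:
            successor_counts[prev][symbol] += 1
        prev = symbol
    return residues

def decode_block(residues: list[int]) -> str:
    """Inverse of encode_block; rebuilds the same predictor state on the fly."""
    successor_counts = defaultdict(lambda: defaultdict(int))
    out = []
    prev = None
    for r in residues:
        if prev is not None and successor_counts[prev]:
            prediction = max(successor_counts[prev], key=successor_counts[prev].get)
        else:
            prediction = "A"
        if r == 0:
            symbol = prediction
        else:
            others = [c for c in ALPHABET if c != prediction]
            symbol = others[r - 1]
        out.append(symbol)
        if prev is not None:
            successor_counts[prev][symbol] += 1
        prev = symbol
    return "".join(out)

if __name__ == "__main__":
    sequence = "ACGTACGTACGGTTACAACGTTAGC" * 8
    blocks = [sequence[i:i + BLOCK_SIZE] for i in range(0, len(sequence), BLOCK_SIZE)]
    encoded = [encode_block(b) for b in blocks]
    # Each block decodes on its own, which is what enables random access.
    assert "".join(decode_block(e) for e in encoded) == sequence
    print("blocks encoded:", len(encoded))
```

Because every block carries no dependency on its neighbours, any single block can be fetched and decoded without touching the rest of the stream; the residues (mostly zeros for predictable sequences) would then be entropy-coded to obtain compression gains such as the bits-per-nucleotide figures reported in the abstract.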
Downloads
- (...).pdf: full text, UGent only, 665.84 KB
Citation
Please use this url to cite or link to this publication: http://hdl.handle.net/1854/LU-5821901
- MLA
- Paridaens, Tom, et al. “Towards Block-Based Compression of Genomic Data with Random Access Functionality.” 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), IEEE, 2014, pp. 1360–63.
- APA
- Paridaens, T., Van Stappen, Y., De Neve, W., Lambert, P., & Van de Walle, R. (2014). Towards block-based compression of genomic data with random access functionality. 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 1360–1363. New York, NY, USA: IEEE.
- Chicago author-date
- Paridaens, Tom, Yves Van Stappen, Wesley De Neve, Peter Lambert, and Rik Van de Walle. 2014. “Towards Block-Based Compression of Genomic Data with Random Access Functionality.” In 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 1360–63. New York, NY, USA: IEEE.
- Chicago author-date (all authors)
- Paridaens, Tom, Yves Van Stappen, Wesley De Neve, Peter Lambert, and Rik Van de Walle. 2014. “Towards Block-Based Compression of Genomic Data with Random Access Functionality.” In 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 1360–1363. New York, NY, USA: IEEE.
- Vancouver
- 1. Paridaens T, Van Stappen Y, De Neve W, Lambert P, Van de Walle R. Towards block-based compression of genomic data with random access functionality. In: 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP). New York, NY, USA: IEEE; 2014. p. 1360–3.
- IEEE
- [1] T. Paridaens, Y. Van Stappen, W. De Neve, P. Lambert, and R. Van de Walle, “Towards block-based compression of genomic data with random access functionality,” in 2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), Atlanta, USA, 2014, pp. 1360–1363.
@inproceedings{5821901,
  abstract  = {{Current algorithms for compressing genomic data mostly focus on achieving high levels of effectiveness and reasonable levels of efficiency, ignoring the need for features such as random access and stream processing. Therefore, in this paper, we introduce a novel framework for compressing genomic data, with the aim of allowing for a better trade-off between effectiveness, efficiency and functionality. To that end, we draw upon concepts taken from the area of media data processing. In particular, we propose to compress genomic data as small blocks of data, using encoding tools that predict the nucleotides and that correct the prediction made by storing a residue. We also propose two techniques that facilitate random access. Our experimental results demonstrate that the compression effectiveness of the proposed approach is up to 1.91 bits per nucleotide, which is significantly better than binary encoding (3 bits per nucleotide) and Huffman coding (2.21 bits per nucleotide).}},
  author    = {{Paridaens, Tom and Van Stappen, Yves and De Neve, Wesley and Lambert, Peter and Van de Walle, Rik}},
  booktitle = {{2014 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP)}},
  isbn      = {{9781479970889}},
  keywords  = {{Genomic Data Storage, DNA Sequence Compression, Random Access}},
  language  = {{eng}},
  location  = {{Atlanta, USA}},
  pages     = {{1360--1363}},
  publisher = {{IEEE}},
  title     = {{Towards block-based compression of genomic data with random access functionality}},
  year      = {{2014}},
}