File compression is an important topic today and is necessary for transporting information in a more compact form. Within data compression there is a variant for images, which this project addresses with a focus on the .JPEG extension. Compression of these images is based on Huffman's algorithm, which assigns codes of different bit lengths to each of the characters in a file: if shorter codes are assigned to the characters that appear most often, file compression is achieved. This document presents the theoretical development of the research protocol for the project to improve the quality of the Huffman algorithm, following several stages such as the statement of the problem, together with the general and specific objectives to be met; it includes a hypothesis developed on the topic and the justification for carrying out this project.
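As a minimal sketch of the idea described above (variable-length codes, with shorter codes for more frequent symbols), the following Python function builds a Huffman code table from a byte string. The function name `huffman_codes` and the sample input are illustrative, not part of the project's implementation.

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table mapping each byte value to a bit string.
    More frequent symbols end up with shorter codes."""
    freq = Counter(data)
    # Heap entries are (frequency, tie-breaker, node); a node is either a
    # symbol (leaf) or a (left, right) pair (internal tree node).
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate input: a single distinct symbol
        _, _, sym = heap[0]
        return {sym: "0"}
    # Repeatedly merge the two least frequent subtrees.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    # Walk the final tree: left edges emit "0", right edges emit "1".
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

# Example: 'a' appears most often, so it receives the shortest code.
codes = huffman_codes(b"aaaabbc")
```

For the sample `b"aaaabbc"`, the symbol `a` gets a 1-bit code while `b` and `c` get 2-bit codes, so the 7 symbols encode in 10 bits instead of 56 fixed-width bits.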

The content of this project is based on .JPEG images, which will be referred to throughout the project. Handling compressed or compacted files has improved the use of large amounts of information in a small space, but compression of the information has not been adequate, because it is still necessary to reduce the size of images in order to transport them; to do this, it is necessary to build a compression algorithm and improve it. How can the quality of the Huffman compression algorithm for .JPEG images be improved? By reviewing the operation of the Huffman algorithm and applying quality standards to make the necessary modifications, a compression of images with lower weight can be obtained, relying on programming for the modification itself. The project will have a useful life until another, improved version is produced, and it is designed for everyone who wishes to use image compression.
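To quantify what "lower weight" means for any Huffman-style scheme, one can compute the Shannon entropy of the input, which is the lower bound on the average bits per symbol that any prefix code (Huffman included) can achieve. This is a self-contained sketch; the function name `bits_lower_bound` and the sample data are illustrative assumptions.

```python
import math
from collections import Counter

def bits_lower_bound(data: bytes) -> float:
    """Shannon entropy of the byte distribution: the minimum average
    number of bits per symbol achievable by any prefix code."""
    freq = Counter(data)
    n = len(data)
    return sum(-(f / n) * math.log2(f / n) for f in freq.values())

# A skewed distribution compresses well below the fixed 8 bits/byte.
sample = b"aaaabbc"
print(f"{bits_lower_bound(sample):.3f} bits/symbol vs 8 fixed")
```

Comparing a modified Huffman variant's actual average code length against this bound is one concrete way to apply the "quality standards" the paragraph mentions.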
