An image is an optical representation of an object illuminated by a radiation source. A digital image can be defined as a function of two spatial coordinates whose value at each position represents an amplitude such as brightness.
In electrical and computer engineering, image processing is a form of signal processing that uses an image as the input signal. The output of image processing can be a set of characteristics or an entire image. Image processing technology can be broadly classified into three categories:
Analog image processing is used in television broadcasting through dish antenna systems and is standard in older television models. In an analog image processing system, signals are received in analog form and carry characteristics such as phase shift and phase difference.
Optical image processing techniques are used in microscopes and telescopes to visualize and enhance a specific object. A combination of lenses is arranged in a specific orientation to perform precise operations.
Digital Image Processing
Digital image processing (DIP) involves treating an input image as a discrete signal and applying standard signal processing algorithms to perform specific operations. DIP is the most advanced branch of modern image processing technology. Generally speaking, the input signal is analyzed with the following operations:
Image Acquisition
The first step of DIP is image acquisition, where the input image is captured by a scanner or a digital camera module. Image acquisition may also include pre-processing techniques such as enhancement, color model conversions, etc. Enhancement techniques emphasize specific characteristics of the input image, while color model conversions modify the colors of the image.
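As a concrete illustration of a color model conversion, the following Python sketch converts RGB pixels to grayscale using the standard ITU-R BT.601 luminance weights (the same weighting Matlab's rgb2gray applies). The tiny 2x2 image is made up for this example.

```python
# Illustrative sketch: RGB-to-grayscale color model conversion using the
# ITU-R BT.601 luminance weights (also used by Matlab's rgb2gray).

def rgb_to_gray(pixel):
    """Convert one (R, G, B) pixel with 0-255 channels to a gray intensity."""
    r, g, b = pixel
    return round(0.299 * r + 0.587 * g + 0.114 * b)

# A tiny 2x2 RGB "image" for demonstration purposes.
image = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]

gray = [[rgb_to_gray(p) for p in row] for row in image]
print(gray)  # [[76, 150], [29, 255]]
```

Note how green contributes the most to perceived brightness, which is why its weight is the largest.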
Segmentation
Segmentation is the process of partitioning the input image into its integral parts. This is one of the most challenging steps of DIP. The segmentation procedure is accomplished by scanning the image pixel by pixel and classifying each pixel against a specific threshold value.
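The pixel-by-pixel thresholding idea can be sketched in a few lines of Python (the article's later code is Matlab, but the logic is language-independent; the threshold value 128 is an arbitrary choice for this illustration):

```python
# Minimal sketch of threshold-based segmentation: scan the image pixel by
# pixel and classify each one as foreground (1) or background (0).

def segment(image, threshold=128):
    return [[1 if pixel >= threshold else 0 for pixel in row]
            for row in image]

gray = [[ 12, 200, 199],
        [ 40, 255,  30],
        [  0, 180,  90]]

print(segment(gray))  # [[0, 1, 1], [0, 1, 0], [0, 1, 0]]
```

In practice the threshold is often chosen automatically (e.g., from the image histogram) rather than fixed by hand.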
Representation
The raw pixel data output from segmentation is used in the representation step. Corners and inflection points are analyzed using boundary representation and edge detection. Regional representation of the entire image is required to analyze textural or skeletal shapes.
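A very simplified Python stand-in for boundary representation is shown below: on a binary image, a foreground pixel is treated as a boundary pixel if any of its four neighbours is background. Real boundary tracing (as done later by Matlab's bwboundaries) returns ordered boundary coordinates, but this sketch captures the core idea.

```python
# Hedged sketch of boundary detection on a binary image: a pixel is kept
# only if it is foreground (1) and touches at least one background (0)
# 4-neighbour. Pixels outside the image count as background.

def edges(bw):
    h, w = len(bw), len(bw[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if bw[y][x] != 1:
                continue
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if not (0 <= ny < h and 0 <= nx < w) or bw[ny][nx] == 0:
                    out[y][x] = 1   # boundary pixel
                    break
    return out

square = [[0, 0, 0, 0, 0],
          [0, 1, 1, 1, 0],
          [0, 1, 1, 1, 0],
          [0, 1, 1, 1, 0],
          [0, 0, 0, 0, 0]]
print(edges(square))  # only the 8 border pixels of the 3x3 block remain 1
```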
Restoration
A mathematical, model-based restoration technique is used to enhance the quality of the processed image. Convolution-based filters (Gaussian, mean, etc.) provide convenient methods of image restoration. A brilliant article by @terrylovejoy about the effective use of filters in astrophotography can be read here.
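As an illustration of convolution-based filtering, this Python sketch convolves a grayscale image with a 3x3 Gaussian kernel. The integer weights are the common binomial approximation of a Gaussian (sum = 16), and border pixels are left unchanged for brevity.

```python
# Illustrative smoothing sketch: 3x3 Gaussian convolution on a grayscale
# image. The kernel is a binomial approximation of a Gaussian (sum = 16).

KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]

def gaussian_blur(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]          # borders kept unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    acc += KERNEL[dy + 1][dx + 1] * image[y + dy][x + dx]
            out[y][x] = acc // 16            # normalise by the kernel sum
    return out

noisy = [[0,   0,   0],
         [0, 160,   0],
         [0,   0,   0]]
print(gaussian_blur(noisy))  # the isolated spike at the centre drops to 40
```

Spreading the spike's energy over its neighbourhood is exactly what makes Gaussian filtering effective against isolated noise.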
Output Evaluation
There are several image processing algorithms that can be used to evaluate the output image. The characteristics of the output can also be gathered for further analysis such as face detection or fingerprint recognition.
DIP System Components
A variety of components are required to make DIP systems functional. The fundamental elements of a DIP system are:
1. Image sensor
2. Computational device
3. Display device
4. Sufficiently powerful hardware
5. Image processing software
6. Networking equipment
The acquisition of digital images is commonly done with the help of a digital camera. The camera sensors generate electrical outputs that are converted into discrete signals. Computational devices are required to analyze the acquired data using specific algorithms. Specialized image processing hardware is often necessary to achieve a desired level of performance.
Software designed especially for image processing is a core component of DIP systems. OpenCV, ImageJ, Matlab, and eFilm are a few of the most commonly used image processing software packages. These packages support a variety of different programming languages like C, C++, Python, Java, etc. Matlab is widely used in electrical engineering while eFilm is popular for biomedical image analysis. Image displays and networking equipment are also essential components of a practical DIP system.
DIP Applications
Digital image processing can be applied to help solve problems in a wide range of disciplines. Researchers are continually trying to develop new algorithms to improve image processing. Some common applications of DIP include digital video processing, face detection, machine learning, biometric verification, remote sensing, computer vision, pattern and signature recognition, astrophotography, biomedical image analysis, and robotic control.
A detailed discussion on the applications of DIP will be provided in upcoming chapters of this series. Today, we will start by looking at a simple example of DIP.
Shape Identification using DIP in Matlab
The concept of digital image processing can easily be discussed with a practical example. As stated earlier, DIP can be done using any of the most widely used programming languages. Numerous software packages and processing techniques are used in DIP, and there are dedicated applications such as 'Adobe Photoshop' or 'GIMP' that can process digital images in a multitude of ways.
Matlab is a numerical computing environment for engineering and scientific analysis that uses a matrix-based, high-performance programming language known as the "Matlab language". Digital image processing operations are performed in Matlab with the help of built-in functions. The objective of this image processing analysis is to identify the roundness of different shapes. This simple task will be demonstrated using Matlab. The following figure will be used as it contains several uniquely shaped objects.
Begin by changing the 'current folder directory' in Matlab to the local folder containing the sample image. Acquisition of the image is done using the imread function. In this case, the processed image is assigned the variable name 'sample'. The sample image is then converted to grayscale with the help of the rgb2gray function, as shown in Figure 3.
Next, the grayscale image is converted to black and white so that boundary representation can be applied. Objects smaller than a 50-pixel threshold are ignored to reduce the noise.
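The small-object removal step (performed by bwareaopen in the Matlab code) can be sketched in Python as a connected-component filter: label each foreground blob with a flood fill and erase blobs below a minimum area. A minimum area of 3 pixels is used here so the tiny example stays readable; the article's Matlab code uses 50.

```python
# Hedged sketch of noise removal by small-object filtering: flood-fill each
# 4-connected foreground component and erase components smaller than
# min_area pixels (the role bwareaopen plays in the Matlab code).

def remove_small_objects(bw, min_area=3):
    h, w = len(bw), len(bw[0])
    seen = [[False] * w for _ in range(h)]
    out = [row[:] for row in bw]
    for y in range(h):
        for x in range(w):
            if bw[y][x] == 1 and not seen[y][x]:
                stack, component = [(y, x)], []
                seen[y][x] = True
                while stack:                      # flood-fill one component
                    cy, cx = stack.pop()
                    component.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and bw[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(component) < min_area:
                    for cy, cx in component:
                        out[cy][cx] = 0           # erase the noise speck
    return out

bw = [[1, 1, 0, 0, 1],
      [1, 1, 0, 0, 0],
      [0, 0, 0, 0, 0]]
print(remove_small_objects(bw))  # the lone pixel at the top right is erased
```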
Only the outer edges of the shapes are considered and each boundary is drawn within a matrix. A thin white colored boundary line is plotted around each object to highlight the boundary region. The threshold value is set as high as 0.98 to obtain an accurate identification.
The area and perimeter of each object are calculated using the respective formulas (provided below). A metric is formed to estimate roundness by comparing the shape of each object to a perfect circle. The acquired data is then plotted to generate a digitally processed output image.
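The roundness metric is 4*pi*area/perimeter^2, which equals 1.0 for a perfect circle (4*pi*(pi*r^2)/(2*pi*r)^2 = 1) and falls below 1 for any other shape. The Python sketch below computes it for a polygon given as a list of boundary points, using the shoelace formula for area and summed edge lengths for perimeter.

```python
# Sketch of the roundness metric: 4*pi*area/perimeter^2, equal to 1.0 for a
# perfect circle and smaller for less circular shapes.

import math

def roundness(points):
    """Roundness metric of a closed polygon given as (x, y) vertices."""
    n = len(points)
    area = 0.0
    perimeter = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1               # shoelace term
        perimeter += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return 4 * math.pi * area / perimeter ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(round(roundness(square), 2))              # 0.79 for a square

# A many-sided regular polygon approaches a circle, so its metric nears 1:
circle = [(math.cos(t), math.sin(t))
          for t in (2 * math.pi * k / 360 for k in range(360))]
print(roundness(circle))                        # very close to 1.0
```

This matches the metric computed in the Matlab loop below, where the boundary points come from bwboundaries and the area from regionprops.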
Figure 7 shows the output image labelled with the calculated value of roundness. The more circular objects have numbers closer to 1.0.
Conclusions
Digital image processing is becoming one of the most important branches of industrial production technology. The implementation of robotic control algorithms based on DIP technology is helping to increase production efficiency. Biometric verification algorithms with improved accuracy are also being developed based on enhanced image processing.
From urban traffic control systems to astrophotography, image processing is ubiquitous in modern civilization. The major drawback of this technology at present is the limitation of image resolution quality. The viewing angle, optics, and hardware of the camera have a significant impact on the processed result. Ongoing research is expected to mitigate this hurdle with improved image acquisition technologies.
A wide range of software environments can be used to perform DIP. In this article, I used the Matlab programming language to evaluate a fundamental shape-identification operation. The code was developed in Matlab R2017a. The complete code of the program is provided below:
% Image acquisition
sample = imread('g5.png');
imshow(sample);

% Convert to grayscale, then to black and white
I = rgb2gray(sample);
bw = imbinarize(I);
imshow(bw)

% Remove objects smaller than 50 pixels to reduce noise
bw = bwareaopen(bw,50);

% Trace the outer boundary of each object
[B,L] = bwboundaries(bw,'noholes');
imshow(label2rgb(L, @jet, [.5 .5 .5]))
hold on
for k = 1:length(B)
    boundary = B{k};
    plot(boundary(:,2), boundary(:,1), 'w', 'LineWidth', 2)
end

% Estimate the roundness of each object
stats = regionprops(L,'Area','Centroid');
threshold = 0.98;
for k = 1:length(B)
    boundary = B{k};
    delta_sq = diff(boundary).^2;
    perimeter = sum(sqrt(sum(delta_sq,2)));
    area = stats(k).Area;
    metric = 4*pi*area/perimeter^2;
    metric_string = sprintf('%2.2f',metric);
    % Mark objects whose roundness exceeds the threshold
    if metric > threshold
        centroid = stats(k).Centroid;
        plot(centroid(1),centroid(2),'ko');
    end
    text(boundary(1,2)-35,boundary(1,1)+13,metric_string,'Color','k','FontSize',18,'FontWeight','bold');
end
title('Digitally processed output image for steemSTEM');
More image processing algorithms using Matlab can be found here.
Many thanks to @gra for the support in mentoring this article.
If you are interested in reading or writing Science, Technology, Engineering and Math related articles on steemit, feel free to join us in the steemSTEM community! Find more about steemSTEM from here, or visit the official discord server.