IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD

Publication date: 05-04-2012
Publication number: US20120082394A1
Assignee: NOVATEK MICROELECTRONICS CORP.
Application number: 55-67-1317
Filing date: 06-07-2011

CROSS-REFERENCE TO RELATED APPLICATION

[0001]

This application claims the priority benefit of Taiwan application serial no. 99133754, filed Oct. 4, 2010. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.

BACKGROUND OF THE INVENTION

[0002]

1. Field of the Invention

[0003]

The invention relates generally to a multimedia processing apparatus and a multimedia processing method. More particularly, the invention relates to an image processing apparatus and an image processing method.

[0004]

2. Description of Related Art

[0005]

As computing power has rapidly advanced in recent years, digital media have become people's preferred tools for expressing creativity and imagination. In particular, developments in digital image processing applications and related imaging products have allowed people to digitally capture and store the minute details of life.

[0006]

However, because digital images require a large volume of data, many multimedia storage and compression standards adopt the YUV420 color format in order to reduce the data volume of the digital images. When playing back the digital images, the image output device transforms the digital image data from the YUV420 color format back to the YUV422 color format. In the YUV420 color format, the vertical color information of the digital images is half that of the original before down-sampling. Additionally, when the interlaced mode is used to generate the digital images, the color line drop issue becomes more severe, especially where the color changes at high frequency in the vertical direction. The restored images exhibit noticeable color sawtooth patterns and may even exhibit a combing phenomenon.

[0007]

Moreover, when transforming the digital image data from the YUV420 color format to the YUV422 color format, the image output device typically employs a high grayscale vertical filter. However, such a filter requires a large amount of data storage, which significantly increases cost. On the other hand, if a low grayscale filter is used, the digital images become comparatively blurry.

[0008]

The transformation from the YUV420 color format to the YUV422 color format under MPEG (Moving Picture Experts Group) or other compression standards can be executed in either the interlaced mode or the progressive mode. During compression, a front-end compression circuit flags the method employed, so that a back-end image output device can perform decompression accordingly. According to the flag data, the image output device can reduce the visual side effects produced by the YUV420 color format.

[0009]

However, some front-end compression circuits may set the flag data inaccurately. Consequently, when the back-end image output device performs decompression, an inferior decompression method is used to restore the YUV420 color format to the YUV422 color format, thereby producing even more severe visual side effects.

SUMMARY OF THE INVENTION

[0010]

An aspect of the invention provides an image processing apparatus employing a motion detection method to determine a relationship between a target image for restoration and a previous image and a next image thereof, so as to restore a color format and to effectively reduce visual side effects produced during compression.

[0011]

Another aspect of the invention provides an image processing method employing a motion detection method to determine a relationship between a target image for restoration and a previous image and a next image thereof, so as to restore a color format and to effectively reduce visual side effects produced during compression.

[0012]

An aspect of the invention provides an image processing apparatus, including an image detecting unit, an image interpolating unit, and an image blending unit. The image detecting unit detects a pixel difference value of an image frame and a previous image frame or a next image frame thereof and outputs a weight value according to the pixel difference value. The image interpolating unit interpolates a pixel value of the image frame in an intra-field interpolation method and an inter-field interpolation method. The image blending unit blends the pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method according to the weight value, so as to restore the image frame.

[0013]

According to an embodiment of the invention, the image interpolation unit includes an intra-field interpolation unit and an inter-field interpolation unit. The intra-field interpolation unit interpolates the pixel value of the image frame in the intra-field interpolation method. The inter-field interpolation unit interpolates the pixel value of the image frame in the inter-field interpolation method.

[0014]

According to an embodiment of the invention, when the intra-field interpolation unit interpolates the pixel value of the image frame in the intra-field interpolation method, the pixel values of the adjacent pixel points near a target pixel point of the image frame are referenced to interpolate the pixel value of the target pixel point.

[0015]

According to an embodiment of the invention, when the inter-field interpolation unit interpolates the pixel value of the image frame in the inter-field interpolation method, the pixel value of the pixel point corresponding to a target pixel point on an odd field or an even field of the previous image frame, or the pixel value of the pixel point corresponding to the target pixel point on an odd field or an even field of the next image frame is referenced to interpolate the pixel value of the target pixel point.

[0016]

According to an embodiment of the invention, the image frame includes an odd field and an even field. The image detecting unit respectively compares the odd field and the even field of the image frame with an odd field and an even field of the previous image frame, or with an odd field and an even field of the next image frame, so as to obtain the pixel difference value.

[0017]

According to an embodiment of the invention, the pixel value of the image frame includes a grayscale value, a chroma value, or a luminance value.

[0018]

Another aspect of the invention provides an image processing method adapted for an image processing apparatus. The image processing method includes the following steps. A pixel difference value of an image frame and a previous image frame or a next image frame thereof is detected. A weight value is outputted according to the pixel difference value. A pixel value of the image frame is interpolated in an intra-field interpolation method and an inter-field interpolation method. The pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method are blended according to the weight value, so as to restore the image frame.

[0019]

According to an embodiment of the invention, in the step of interpolating the pixel value of the image frame in the intra-field interpolation method and the inter-field interpolation method, when the pixel value of the image frame is interpolated in the intra-field interpolation method, the pixel values of the adjacent pixel points near a target pixel point of the image frame are referenced to interpolate the pixel value of the target pixel point.

[0020]

According to an embodiment of the invention, in the step of interpolating the pixel value of the image frame in the intra-field interpolation method and the inter-field interpolation method, when the pixel value of the image frame is interpolated in the inter-field interpolation method, the pixel value of the pixel point corresponding to the target pixel point on an odd field or an even field of the previous image frame, or the pixel value of the pixel point corresponding to the target pixel point on an odd field or an even field of the next image frame is referenced to interpolate the pixel value of the target pixel point.

[0021]

According to an embodiment of the invention, the image frame includes an odd field and an even field. Moreover, in the step of detecting the pixel difference value of the image frame and the previous image frame or the next image frame thereof, the odd field and the even field of the image frame are respectively compared with an odd field and an even field of the previous image frame, or with an odd field and an even field of the next image frame, so as to obtain the pixel difference value.

[0022]

In summary, according to embodiments of the invention, the image processing apparatus and the image processing method thereof employ the motion detection method to determine the pixel difference value of the target image for restoration and the previous image or the next image, thereby determining the weight value used when restoring the target image frame.

[0023]

It is to be understood that both the foregoing general descriptions and the following detailed embodiments are exemplary and are, together with the accompanying drawings, intended to provide further explanation of technical features and advantages of the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0024]

The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.

[0025]

FIG. 1 is a schematic block diagram of an image processing apparatus according to an embodiment of the invention.

[0026]

FIG. 2 is a schematic view of a motion detection method according to an embodiment of the invention.

[0027]

FIG. 3 is a schematic graph illustrating a correlation between a pixel difference value and a weight value.

[0028]

FIG. 4 is a schematic view illustrating pixels in the intra-field interpolation method according to an embodiment of the invention.

[0029]

FIG. 5 is a flowchart illustrating the steps of an image processing method according to an embodiment of the invention.

DESCRIPTION OF EMBODIMENTS

[0030]

According to exemplary embodiments of the invention, when an image processing apparatus restores a color format of an image frame from YUV420 to YUV422, grayscale and color information are referenced. At the same time, under the interlaced mode, the image processing apparatus considers the amount of pixel motion so as to find an optimal reference point in the time domain. Thereafter, the image processing apparatus uses the reference point and a related weight value to interpolate a high color vertical resolution, while effectively reducing the visual side effects produced during compression.

[0031]

In the exemplary embodiments described hereafter, a chroma value of a target pixel point is interpolated, although the invention should not be construed as limited thereto.

[0032]

FIG. 1 is a schematic block diagram of an image processing apparatus according to an embodiment of the invention. Referring to FIG. 1, in the present embodiment, an image processing apparatus 100 includes an image detecting unit 110, an image interpolating unit 120, and an image blending unit 130. The image interpolating unit 120 includes an intra-field interpolation unit 122 and an inter-field interpolation unit 124.

[0033]

In the present embodiment, after the image detecting unit 110 receives an image signal S, a pixel difference value of a target image frame (e.g., the current image frame) and a previous image frame or a next image frame thereof is detected. Moreover, a weight value α is outputted to the image blending unit 130 according to the pixel difference value.

[0034]

On the other hand, the image interpolating unit 120 also receives the image signal S in order to interpolate the target image frame. In the present embodiment, the intra-field interpolation unit 122 interpolates a pixel value of the target image frame in an intra-field interpolation method and outputs a pixel value C_intra to the image blending unit 130 after interpolation. At the same time, the inter-field interpolation unit 124 interpolates a pixel value of the target image frame in an inter-field interpolation method and outputs a pixel value C_inter to the image blending unit 130 after interpolation.

[0035]

Thereafter, the image blending unit 130 blends the pixel values C_intra and C_inter interpolated by the image interpolating unit 120 according to the weight value α determined by the image detecting unit 110, so as to restore the image frame and output an image signal S′. The pixel values C_intra and C_inter interpolated by the image interpolating unit 120 are, for example, chroma values of the pixel points in the target image frame.

[0036]

Therefore, the image processing apparatus 100 according to the present embodiment does not rely on a flag provided by a front-end compression circuit to perform image restoration. Even if the front-end compression circuit fails to accurately set the flag data, the image processing apparatus 100 can still interpolate a high color vertical resolution, thereby effectively enhancing image output quality.

[0037]

More specifically, the image detecting unit 110 is, for example, a motion image detector using a motion detection method to determine the pixel difference value of the target image frame and the previous image frame or the next image frame thereof, and further to determine the weight value α referenced by the image blending unit 130 during the restoration of the image frame.

[0038]

FIG. 2 is a schematic view of a motion detection method according to an embodiment of the invention. Referring to FIGS. 1 and 2, in the present embodiment, the image compression is performed, for example, in an interlaced mode. Therefore, the image frames I1 and I2 received by the image detecting unit 110 respectively include, for example, even fields f0 and f2 and odd fields f1 and f3. Moreover, in FIG. 2, the symbol ◯ represents the grayscale value of a pixel point, while the remaining two symbols respectively represent the chroma value of a pixel point and the grayscale value of the target pixel point for interpolation in the present embodiment.

[0039]

In the interlaced mode, the even and odd fields each include only the data from every other row of the original image frame. The even fields display the image signal of the even scan lines, whereas the odd fields display the image signal of the odd scan lines, and the two image signals are alternately displayed. Since each field includes only every other row of the original image frame, within a single field the color information in the vertical direction is half that of the original image frame. Moreover, adjacent pixel points commonly reference a single chroma value. Therefore, when restoring the image frame, the image processing apparatus 100 needs to interpolate the chroma values of the pixel points that lack chroma values.

[0040]

For example, in the even field f2, the pixel points P0 and P2 commonly reference the chroma value of the pixel point P0. Likewise, the pixel points P4 and P6 commonly reference the chroma value of the pixel point P4, and so on. Hence, the image interpolating unit 120 needs to interpolate the chroma values of the pixel points P2 and P6 in order to restore the image frame.
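The chroma sharing described above can be sketched as follows; this is a minimal illustration, assuming one chroma value is stored per vertical pair of pixel points (the `shared_chroma` helper and the sample values are hypothetical):

```python
# In YUV 4:2:0, vertically adjacent pixel points share one stored chroma
# value, so every other point in a field has no chroma of its own and
# must be interpolated when restoring YUV 4:2:2.

def shared_chroma(stored_chroma, row):
    """Return the chroma value a pixel row falls back on before interpolation.

    stored_chroma holds one chroma value per vertical pair of rows.
    """
    return stored_chroma[row // 2]

# Rows 0 and 1 both reference the chroma stored for the first pair,
# mirroring how P0 and P2 commonly reference P0's chroma within a field.
chroma = [10, 20, 30]  # hypothetical stored chroma values
assert shared_chroma(chroma, 0) == 10
assert shared_chroma(chroma, 1) == 10
assert shared_chroma(chroma, 2) == 20
```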

[0041]

In FIG. 2, when the interpolation target of the image interpolating unit 120 is the image frame I2, then with respect to time, the image frame I1 is the previous image frame of the image frame I2. Conversely, when the interpolation target of the image interpolating unit 120 is the image frame I1, then with respect to time, the image frame I2 is the next image frame of the image frame I1. In the present embodiment, the interpolation target of the image interpolating unit 120 is, for example, the pixel point P2 of the image frame I2 (indicated by the corresponding symbol in FIG. 2). Therefore, the image frame I1 is the previous image frame of the image frame I2.

[0042]

When objects on the previous image frame I1 move, the pixel values of the corresponding pixel points on the following image frame I2 exhibit a pronounced difference. For example, in the odd field f1, when the object corresponding to the location of the pixel point P3′ moves, a pronounced pixel difference value is exhibited between the grayscale value and the chroma value of the pixel point P3 on the odd field f3 and those of the pixel point P3′ (e.g., the grayscale or chroma difference values between the pixel points P3 and P3′). Similarly, in the even field f0, when the object corresponding to the location of the pixel point P4′ moves, a pixel difference value is exhibited between the grayscale value and the chroma value of the pixel point P4 on the even field f2 and those of the pixel point P4′ (e.g., the grayscale or chroma difference values between the pixel points P4 and P4′).

[0043]

Accordingly, in the present embodiment, when objects on the previous image frame I1 move, the image detecting unit 110 compares, for example, the pixel values of the pixel points P3′ and P3, or P4′ and P4, so as to obtain the pixel difference value. In the present embodiment, the image detecting unit 110 compares the previous image frame and the target image frame, for example, although the invention is not limited thereto. In other embodiments of the invention, the image detecting unit 110 may also compare the next image frame and the target image frame, or simultaneously compare the previous image frame and the next image frame with the target image frame, so as to obtain the pixel difference value.

[0044]

Therefore, the image detecting unit 110 employs the afore-described motion detection method to determine whether objects on the image frame have moved, so as to determine the pixel difference value of the target image frame and the previous image frame or the next image frame thereof, and further to determine the weight value α referenced by the image blending unit 130 during restoration of the image frame.

[0045]

In other words, in order to obtain the pixel difference value, the image detecting unit respectively compares the odd fields and the even fields of the image frame with the odd fields and the even fields of the previous image frame, or with the odd fields and the even fields of the next image frame. After obtaining the pixel difference value, the image detecting unit produces the weight value in accordance with the pixel difference value and outputs the weight value to the image blending unit.

[0046]

FIG. 3 is a schematic graph illustrating a correlation between the pixel difference value and the weight value. Referring to FIGS. 1 to 3, in the present embodiment, after the image detecting unit 110 determines the pixel difference value by the afore-described motion detection method, the weight value α may be produced in accordance with the correlation graph in FIG. 3. Thereafter, the image blending unit 130 uses, for example, the proportional relationship C_intra×α+C_inter×(1−α) to restore the image frame.
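The blending step can be sketched as follows; this is a minimal sketch in which the clamped linear mapping from pixel difference to α, and the `low`/`high` break points, are assumptions standing in for the exact curve of FIG. 3:

```python
def alpha_from_difference(diff, low, high):
    """Map a pixel difference value to a weight alpha in [0, 1].

    A clamped linear ramp is assumed here; the actual correlation
    curve is given by FIG. 3.
    """
    if diff <= low:
        return 0.0
    if diff >= high:
        return 1.0
    return (diff - low) / (high - low)

def blend(c_intra, c_inter, alpha):
    """Restore the pixel as C_intra * alpha + C_inter * (1 - alpha)."""
    return c_intra * alpha + c_inter * (1 - alpha)

# Large difference (motion): alpha = 1, the intra-field result is used.
assert blend(80, 40, alpha_from_difference(100, 10, 50)) == 80
# Small difference (static): alpha = 0, the inter-field result is used.
assert blend(80, 40, alpha_from_difference(5, 10, 50)) == 40
```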

[0047]

For example, in motion images, the objects on the image frames typically exhibit pronounced changes; therefore, a pixel difference value D1 detected by the image detecting unit 110 is comparatively large. According to FIG. 3, the pixel difference value D1 corresponds to, for example, α=1. Therefore, when the image blending unit 130 restores the image frame, a proportional relationship of C_intra×1+C_inter×0, for example, is used. Hence, at this time the image blending unit 130 restores the image frame in accordance with the interpolation result of the intra-field interpolation unit 122.

[0048]

Moreover, in static images, for example, the objects on the image frames typically do not exhibit major changes; therefore, a pixel difference value D2 detected by the image detecting unit 110 is comparatively small. According to FIG. 3, the pixel difference value D2 corresponds to, for example, α=0. Therefore, when the image blending unit 130 restores the image frame, a proportional relationship of C_intra×0+C_inter×1, for example, is used. Hence, at this time the image blending unit 130 restores the image frame in accordance with the interpolation result of the inter-field interpolation unit 124.

[0049]

Therefore, according to the degree of variation in the image frame, the corresponding α in FIG. 3 referenced by the image detecting unit 110 also varies correspondingly. Hence, according to the embodiments of the invention, in order to restore the image frame, the image processing apparatus 100 may suitably adjust the proportional relationship between the pixel value C_intra interpolated in the intra-field interpolation method and the pixel value C_inter interpolated in the inter-field interpolation method according to the degree of variation in the image frame, without relying on a flag provided by the front-end compression circuit to perform image restoration. Therefore, even if the front-end compression circuit fails to accurately set the flag data, the image processing apparatus 100 can still interpolate a high color vertical resolution, thereby effectively enhancing image output quality.

[0050]

Taking the chroma values interpolated by the inter-field interpolation unit 124 as an example, the chroma value of the target pixel point is interpolated in the inter-field interpolation method. Moreover, according to the present embodiment, the inter-field interpolation unit 124 uses the chroma value of the pixel point at the same location on a previous image frame or a next image frame to interpolate the chroma value of the target pixel point.

[0051]

In other words, when the inter-field interpolation unit 124 interpolates the pixel value of the image frame in the inter-field interpolation method, the pixel value of the pixel point corresponding to the target pixel point on the odd field or the even field of the previous image frame, or the pixel value of the pixel point corresponding to the target pixel point on the odd field or the even field of the next image frame is referenced to interpolate the pixel value of the target pixel point.

[0052]

Taking the chroma values interpolated by the intra-field interpolation unit 122 as an example, the chroma value of the target pixel point is interpolated in the intra-field interpolation method. When the intra-field interpolation unit 122 interpolates the pixel value of the image frame in the intra-field interpolation method, the pixel values of the adjacent pixel points near the target pixel point of the image frame are referenced to interpolate the pixel value of the target pixel point.

[0053]

More specifically, FIG. 4 is a schematic view illustrating pixels in the intra-field interpolation method according to an embodiment of the invention. Referring to FIGS. 1-4, FIG. 4 illustrates, in the even field f2 of the image frame I2, the target pixel point P2 for interpolation by the intra-field interpolation unit 122, as well as the 8 adjacent pixel points T, B, L, R, P1″, P3″, P4″, and P5″ nearby.

[0054]

In FIG. 4, the first row, having the pixel points P1″, T, and P3″, and the third row, having the pixel points P4″, B, and P5″, include both grayscale values and chroma values. On the other hand, the second row, having the pixel points L, P2, and R, includes only grayscale values. Therefore, in the present embodiment, the intra-field interpolation unit 122 interpolates the chroma value of the target pixel point P2 in the intra-field interpolation method detailed hereafter.

[0055]

Firstly, before interpolating the chroma value of the target pixel point P2, the intra-field interpolation unit 122 first confirms whether a grayscale difference value between the grayscale value of the pixel point P2 and the grayscale values of the 8 adjacent pixel points is larger than a grayscale threshold value. For example, the intra-field interpolation unit 122 calculates the difference value between the grayscale value of the pixel point P2 and the average of the grayscale values of the 8 adjacent pixel points, compares it with the grayscale threshold value, and then takes the larger of the two as the effective grayscale threshold value.

[0056]

The afore-described judging method may be exemplarily depicted by program code as follows:

[0000]


valid_th = max(yP2 − (yP1″ + yT + yP3″ + yL + yR + yP4″ + yB + yP5″)/8, coring_th)

[0057]

where valid_th is the effective grayscale threshold value, coring_th is the grayscale threshold value, and yP2, yP1″, yT, yP3″, yL, yR, yP4″, yB, and yP5″ are the grayscale values of the pixel point P2 and the 8 adjacent pixel points, respectively.
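This threshold computation can be sketched in Python as follows; note that taking the absolute value of the difference is an assumption (without it, the max would always reduce to coring_th whenever the center is darker than the neighbor average):

```python
def effective_threshold(y_p2, neighbor_grays, coring_th):
    """valid_th = max(|yP2 - mean of the 8 neighbors|, coring_th).

    The absolute value is assumed; neighbor_grays holds the grayscale
    values of the 8 adjacent pixel points.
    """
    mean = sum(neighbor_grays) / len(neighbor_grays)
    return max(abs(y_p2 - mean), coring_th)

# Hypothetical example: the center deviates by 20 from the neighbor mean,
# which exceeds a coring threshold of 5, so the deviation wins.
assert effective_threshold(120, [100] * 8, 5) == 20
# When the deviation is small, coring_th wins instead.
assert effective_threshold(102, [100] * 8, 5) == 5
```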

[0058]

Thereafter, the intra-field interpolation unit 122 determines a weight value ω of the pixel point T for interpolating the chroma value of the target pixel point P2 according to a relationship between the chroma values of the pixel points T, B, P1″, P3″, P4″, and P5″.

[0059]

For example, when the chroma value of the pixel point P1″ is closer to the chroma value of the pixel point T, then the pixel point T may receive two votes. Conversely, when the chroma value of the pixel point P1″ is closer to the chroma value of the pixel point B, then the pixel point B may receive two votes. When the chroma value of the pixel point P1″ is close to the chroma values of the pixel points T and B (e.g., with a difference less than the effective grayscale threshold value valid_th), then the pixel points T and B may each receive one vote. Similarly, the votes received by the pixel points T and B through the pixel points P3″, P4″, and P5″ may also be determined by the afore-described method.

[0060]

For example, assuming the chroma values of the pixel points T, B, P1″, P3″, P4″, and P5″ are respectively 100, 200, 120, 120, 150, and 150, then the votes received by the pixel point T through the pixel points P1″, P3″, P4″, and P5″ are respectively 2, 2, 1, and 1 (i.e., 6 total votes), for example. Moreover, the votes received by the pixel point B through the pixel points P1″, P3″, P4″, and P5″ are respectively 0, 0, 1, and 1 (i.e., 2 total votes), for example.

[0061]

Therefore, the intra-field interpolation unit 122 determines the weight value ω of the pixel point T for interpolating the chroma value of the target pixel point P2 according to a ratio of the total votes of the pixel points T and B.

[0062]

The afore-described judging method may be exemplarily depicted by program code as follows:

    • For (P″ = P1″, P3″, P4″, P5″)
    • if (chromaP″ closer to chromaT than chromaB by more than valid_th)
      • voteT += 2
    • else if (chromaP″ closer to chromaB than chromaT by more than valid_th)
      • voteB += 2
    • else {voteT += 1, voteB += 1}

[0069]

where chromaP″ represents the chroma value of each of the pixel points P1″, P3″, P4″, and P5″; chromaT and chromaB are the respective chroma values of the pixel points T and B; and voteT and voteB are the votes received by the pixel points T and B, respectively.
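A runnable sketch of this voting scheme, assuming "closeness" is measured by absolute chroma difference (the valid_th value in the example below is hypothetical):

```python
def tally_votes(chroma_t, chroma_b, neighbor_chromas, valid_th):
    """Tally votes for T and B from the four chroma-carrying neighbors."""
    vote_t = vote_b = 0
    for c in neighbor_chromas:
        d_t = abs(c - chroma_t)      # distance to T's chroma
        d_b = abs(c - chroma_b)      # distance to B's chroma
        if d_b - d_t > valid_th:     # clearly closer to T: two votes for T
            vote_t += 2
        elif d_t - d_b > valid_th:   # clearly closer to B: two votes for B
            vote_b += 2
        else:                        # roughly equidistant: one vote each
            vote_t += 1
            vote_b += 1
    return vote_t, vote_b

# The paragraph's example: T = 100, B = 200, neighbors 120, 120, 150, 150.
# With a hypothetical valid_th of 10, T receives 6 votes and B receives 2.
assert tally_votes(100, 200, [120, 120, 150, 150], 10) == (6, 2)
```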

[0070]

After determining the weight value ω of the pixel point T, the intra-field interpolation unit 122 interpolates the chroma value of the target pixel point P2 in accordance with the following formula:

[0000]


x = ω×t + (1−ω)×b

[0071]

where x, t, and b are the chroma values of the pixel points P2, T, and B, respectively.
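Putting the vote tally and the formula together; taking ω as T's share of the total votes is an assumption, consistent with the vote ratio mentioned above:

```python
def interpolate_chroma(vote_t, vote_b, t, b):
    """x = w * t + (1 - w) * b, with w assumed to be T's vote share."""
    w = vote_t / (vote_t + vote_b)
    return w * t + (1 - w) * b

# With the 6-to-2 tally from the example and chroma values t = 100,
# b = 200: w = 0.75 and the interpolated chroma is 125.
assert interpolate_chroma(6, 2, 100, 200) == 125.0
```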

[0072]

Therefore, in the present embodiment, the intra-field interpolation unit 122 interpolates the chroma value of the target pixel point P2 in the intra-field interpolation method detailed above.

[0073]

FIG. 5 is a flowchart illustrating the steps of an image processing method according to an embodiment of the invention. With reference to FIGS. 1 and 5, the image processing method of the present embodiment includes the following steps.

[0074]

In Step S500, by using the image detecting unit 110, a pixel difference value of the target image frame and a previous image frame or a next image frame thereof is detected. Moreover, a weight value is outputted according to the pixel difference value.

[0075]

On the other hand, in Step S502, by using the image interpolating unit 120, the pixel value of the image frame is interpolated in the intra-field interpolation method and the inter-field interpolation method.

[0076]

Thereafter, in Step S504, by using the image blending unit 130, the pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method are blended according to the weight value, so as to restore the image frame.

[0077]

It should be noted that, although in the present embodiment Step S500 is depicted as being performed before Step S502, the invention should not be construed as limited thereto. When interpolation is performed in practice, Steps S500 and S502 may be executed concurrently.
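The three steps can be composed into a single per-pixel function; this is a minimal sketch in which the detection threshold and the hard 0/1 weight are assumptions (the real weight follows the correlation of FIG. 3):

```python
def restore_pixel(curr, prev, c_intra, c_inter, threshold):
    """Compose the flowchart's steps for one pixel.

    S500: detect the pixel difference against the previous frame and
          derive a weight; S502: the two interpolation results are
          taken as inputs here; S504: blend them by the weight.
    """
    diff = abs(curr - prev)                     # S500: pixel difference
    alpha = 1.0 if diff > threshold else 0.0    # simplified weight mapping
    return c_intra * alpha + c_inter * (1 - alpha)  # S504: blend

# Moving content: large difference, the intra-field result is used.
assert restore_pixel(100, 10, 80, 40, 20) == 80.0
# Static content: small difference, the inter-field result is used.
assert restore_pixel(100, 99, 80, 40, 20) == 40.0
```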

[0078]

Moreover, the image processing method described in the present embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in FIGS. 1-4, and therefore no further description is provided herein.

[0079]

In view of the foregoing, according to exemplary embodiments of the invention, the image processing apparatus and the image processing method thereof employ the motion detection method to determine the pixel difference value of the target image for restoration and the previous image or the next image, thereby determining the weight value when restoring the target image frame. Therefore, the image processing apparatus does not rely on the flag provided by the front-end compression circuit to perform image restoration. Moreover, even if the front-end compression circuit fails to accurately set the flag data, the image processing apparatus can still interpolate a high color vertical resolution, thereby effectively enhancing image output quality.

[0080]

It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.



An image processing apparatus including an image detecting unit, an image interpolating unit and an image blending unit is provided. The image detecting unit detects a pixel difference value of an image frame and a previous image frame or a next image frame thereof and outputs a weight value according to the pixel difference value. The image interpolating unit interpolates a pixel value of the image frame in an intra-field interpolation method and an inter-field interpolation method. The image blending unit blends the pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method to restore the image frame according to the weight value. An image processing method is also provided.



1. An image processing apparatus, comprising:

an image detecting unit detecting a pixel difference value of an image frame and a previous image frame or a next image frame thereof and outputting a weight value according to the pixel difference value;

an image interpolating unit interpolating a pixel value of the image frame in an intra-field interpolation method and an inter-field interpolation method; and

an image blending unit blending the pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method according to the weight value, so as to restore the image frame.

2. The image processing apparatus as claimed in claim 1, wherein the image interpolating unit comprises:

an intra-field interpolation unit interpolating the pixel value of the image frame in the intra-field interpolation method; and

an inter-field interpolation unit interpolating the pixel value of the image frame in the inter-field interpolation method.

3. The image processing apparatus as claimed in claim 2, wherein when the intra-field interpolation unit interpolates the pixel value of the image frame in the intra-field interpolation method, pixel values of adjacent pixel points near a target pixel point of the image frame are referenced to interpolate the pixel value of the target pixel point.

4. The image processing apparatus as claimed in claim 2, wherein when the inter-field interpolation unit interpolates the pixel value of the image frame in the inter-field interpolation method, a pixel value of a pixel point corresponding to a target pixel point on an odd field or an even field of the previous image frame, or a pixel value of a pixel point corresponding to the target pixel point on an odd field or an even field of the next image frame is referenced to interpolate the pixel value of the target pixel point.

5. The image processing apparatus as claimed in claim 1, wherein the image frame comprises an odd field and an even field, and the image detecting unit respectively compares the odd field and the even field of the image frame with an odd field and an even field of the previous image frame, or with an odd field and an even field of the next image frame, so as to obtain the pixel difference value.

6. The image processing apparatus as claimed in claim 1, wherein the pixel value of the image frame comprises a grayscale value, a chroma value, or a luminance value.

7. An image processing method adapted for an image processing apparatus, the method comprising:

detecting a pixel difference value of an image frame and a previous image frame or a next image frame thereof;

outputting a weight value according to the pixel difference value;

interpolating a pixel value of the image frame in an intra-field interpolation method and an inter-field interpolation method; and

blending the pixel value interpolated in the intra-field interpolation method and the pixel value interpolated in the inter-field interpolation method according to the weight value, so as to restore the image frame.

8. The image processing method as claimed in claim 7, wherein in the step of interpolating the pixel value of the image frame in the intra-field interpolation method and the inter-field interpolation method, when the pixel value of the image frame is interpolated in the intra-field interpolation method, pixel values of adjacent pixel points near a target pixel point of the image frame are referenced to interpolate the pixel value of the target pixel point.

9. The image processing method as claimed in claim 7, wherein in the step of interpolating the pixel value of the image frame in the intra-field interpolation method and the inter-field interpolation method, when the pixel value of the image frame is interpolated in the inter-field interpolation method, a pixel value of a pixel point corresponding to a target pixel point on an odd field or an even field of the previous image frame, or a pixel value of a pixel point corresponding to the target pixel point on an odd field or an even field of the next image frame is referenced to interpolate the pixel value of the target pixel point.

10. The image processing method as claimed in claim 7, wherein the image frame comprises an odd field and an even field, and in the step of detecting the pixel difference value of the image frame and the previous image frame or the next image frame thereof, the odd field and the even field of the image frame are respectively compared with an odd field and an even field of the previous image frame, or with an odd field and an even field of the next image frame, so as to obtain the pixel difference value.

11. The image processing method as claimed in claim 7, wherein the pixel value of the image frame comprises a grayscale value, a chroma value, or a luminance value.