3D RENDERING METHOD AND DEVICE THEREOF

Publication date: 07-02-2017
Number: KR1020170013747A
Assignee:
Contacts:
Application number: 01-15-102006829
Application date: 28-07-2015

[1]

Techniques related to three-dimensional (3D) computer graphics are described below.

[2]

Unlike two-dimensional computer graphics, three-dimensional (3D) computer graphics stores a geometric data model, processes that model, and outputs a two-dimensional image as the result. Here, the geometric data of the model includes position information for each of the points that make up the model. In 3D computer graphics, 3D rendering transforms a 3D model into a 2D image, either by calculating the flow of light to produce a photorealistic image or by using non-photorealistic rendering (NPR) techniques. Specifically, 3D rendering is a process of determining, from the relationships among the points (or vertices) that make up the model, the colors of the set of pixels that is finally displayed.

[3]

A 3D rendering method according to one embodiment includes: determining, based on characteristic information of a 3D model, a vertex of the 3D model to which first shading is to be applied; determining a first shading effect by performing the first shading on the determined vertex; determining a pixel area to which second shading is to be applied, based on reference information indicating whether the first shading was applied; determining a second shading effect by performing the second shading on the determined pixel area; and generating a rendering result image based on the first shading effect and the second shading effect.

[4]

In the 3D rendering method according to one embodiment, determining the vertex to which the first shading is to be applied may include determining the vertex based on the edge lengths between adjacent vertices of the 3D model projected onto the screen area.

[5]

The 3D rendering method according to one embodiment may further include determining whether a surface of the 3D model can be subdivided, and, when the surface can be subdivided, dividing the surface of the 3D model into a plurality of regions.

[6]

In the 3D rendering method according to one embodiment, determining the vertex to which the first shading is to be applied may include determining the vertex based on at least one of: vertex density information of the 3D model, area information of polygons composed of the vertices of the 3D model, distance information between vertices projected onto the screen area, area information of polygons projected onto the screen area, and information on the distance between a vertex and a virtual light source.

[7]

In the 3D rendering method according to one embodiment, the first shading may determine the first shading effect per vertex of the 3D model, and the second shading may determine the second shading effect per pixel of the video frame in which the 3D model is represented.

[8]

A 3D rendering device according to one embodiment includes: a determiner that determines, based on characteristic information of a 3D model, a vertex of the 3D model to which first shading is to be applied; a first shader that determines a first shading effect by performing the first shading on the determined vertex; a second shader that determines a pixel area to which second shading is to be applied, based on reference information indicating whether the first shading was applied, and determines a second shading effect by performing the second shading on the determined pixel area; and a rendering image generator that generates a rendering result image based on the first shading effect and the second shading effect.

[9]

In the 3D rendering device according to one embodiment, the first shader may assign, to the vertices of the 3D model projected onto the screen area, attribute values indicating whether the first shading was performed.

[10]

The 3D rendering device according to one embodiment may further include a divider that determines whether a surface of the 3D model can be subdivided and, when the surface can be subdivided, divides the surface of the 3D model into a plurality of regions.

[11]

A 3D rendering device according to another embodiment includes: a determiner that determines a shading type, among first shading and second shading, to be applied to a current image frame; a first shader that, when the determined shading type is the first shading, determines a first shading effect by performing the first shading on the current image frame; a second shader that, when the determined shading type is the second shading, determines a second shading effect by performing the second shading on the current image frame; and a rendering image generator that generates a rendering result image of the current image frame based on the first shading effect or the second shading effect.

[12]

Figure 1 is a diagram illustrating the configuration of a 3D rendering device according to one embodiment. Figure 2 is a diagram illustrating the performance of first shading and second shading according to one embodiment. Figure 3 is a diagram illustrating the performance of first shading and second shading based on surface subdivision according to one embodiment. Figure 4 is a flowchart illustrating the operation of a 3D rendering method according to one embodiment. Figure 5 is a flowchart illustrating the operation of a 3D rendering method according to another embodiment. Figure 6 is a diagram illustrating the configuration of a 3D rendering device according to another embodiment.

[13]

Hereinafter, embodiments are described in detail with reference to the attached drawings. The specific structural and functional descriptions below are provided only for the purpose of describing the embodiments by example, and the scope of rights is not to be interpreted as limited to the content described herein. A person skilled in the art may make various modifications and variations from this description. References in this specification to "one embodiment" or "an embodiment" mean that a specific feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment; such references do not necessarily all refer to the same embodiment.

[14]

In addition, the terms used herein are used only to describe particular embodiments and are not intended to limit the embodiments. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, the terms "comprising" or "having" specify the presence of the stated features, numbers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

[15]

In the description with reference to the attached drawings, identical elements are given the same reference numerals regardless of the drawing, and redundant descriptions are omitted. In describing the embodiments, detailed descriptions of related publicly known techniques are omitted when it is determined that they would unnecessarily obscure the subject matter of the embodiments.

[16]

Figure 1 is a diagram illustrating the configuration of a 3D rendering device according to one embodiment. The 3D rendering device (100) renders an input 3D model (3D rendering) and outputs a 2D rendering result image. The 3D rendering device (100) can determine pixel information of the rendering result image based on the 3D model expressed in a virtual space, the position of a virtual camera, and a virtual light source. Here, the virtual light source emits light in a given direction within the virtual space, and the virtual camera determines the viewpoint from which the 3D model is observed.

[17]

According to one embodiment, the 3D rendering device (100) can perform rendering by combining different types of shading within a single image frame. For example, the 3D rendering device (100) may perform first shading on a vertex basis for one region of the image frame, and perform second shading on a pixel basis for other regions. Here, a vertex indicates a point that makes up the 3D model. In 3D rendering, shading is the process of calculating the surface color of the 3D model based on the direction and position of the virtual light source.

[18]

Referring to Figure 1, the 3D rendering device (100) includes a determiner (110), a first shader (120), a second shader (130), and a rendering image generator (140).

[19]

The determiner (110) can determine, among the vertices of the input 3D model projected onto the screen area, one or more vertices to which the first shading is to be applied. Here, the vertices of the 3D model projected onto the screen area indicate, among the vertices of the 3D model, those arranged to appear in the rendering result image.

[20]

In determining the vertices to which the first shading is applied, the determiner (110) can use characteristic information of the 3D model. The characteristic information of the 3D model includes geometry characteristic information of the 3D model. The characteristic information of the 3D model can be determined based on at least one of vertex information of the 3D model, position information of the virtual camera, direction information of the virtual camera, position information of the virtual light source, and direction information of the virtual light source. Here, the vertex information of the 3D model includes vertex attribute information such as the 3D coordinates, color, and normal of each vertex.
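As a concrete illustration of the kind of characteristic information listed above, the following is a minimal C++ sketch of the data the determiner (110) could operate on. The type and field names (Vertex, SceneCharacteristics, and so on) are hypothetical and are not taken from the patent.

```cpp
#include <array>
#include <vector>

// Hypothetical vertex attributes: 3D coordinates, color, and normal,
// matching the attribute information mentioned for the vertex information.
struct Vertex {
    std::array<float, 3> position;  // 3D coordinates
    std::array<float, 3> color;     // RGB color
    std::array<float, 3> normal;    // surface normal
};

// Hypothetical container for the characteristic information of the 3D model:
// vertex information plus virtual camera and virtual light source parameters.
struct SceneCharacteristics {
    std::vector<Vertex> vertices;         // vertex information of the 3D model
    std::array<float, 3> cameraPosition;  // position of the virtual camera
    std::array<float, 3> cameraDirection; // direction of the virtual camera
    std::array<float, 3> lightPosition;   // position of the virtual light source
    std::array<float, 3> lightDirection;  // direction of the virtual light source
};
```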

[21]

According to one embodiment, the determiner (110) can determine the vertices to which the first shading is applied using at least one of: vertex density information of the 3D model, area information of polygons composed of the vertices of the 3D model, distance information between vertices projected onto the screen area, area information of polygons projected onto the screen area, and information on the distance between a vertex and the virtual light source. Here, a polygon indicates a face of the 3D model composed of three or more vertices. In the following, the way in which the determiner (110) determines the vertices to which the first shading is applied using each kind of information is described by embodiment. The embodiments described below are intended to aid understanding, and the scope of the invention is not to be interpreted as limited by them.

[22]

(1) Case in which the determiner (110) uses vertex density information of the 3D model

[23]

The determiner (110) can determine vertices with a high vertex density as vertices to which the first shading is applied. Vertex density information can indicate the distances between vertices. For example, a relatively short distance between vertices indicates a relatively high vertex density, and a relatively long distance between vertices indicates a relatively low vertex density. The determiner (110) can determine whether to apply the first shading based on the lengths of the edges between a vertex of the 3D model and its adjacent vertices. For example, when a vertex forms only edges shorter than a preset length, the determiner (110) can determine the corresponding vertex as a vertex to which the first shading is applied.
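The edge-length criterion just described could, for example, be realized as in the hedged C++ sketch below, which flags a vertex for the first shading when every edge to an adjacent vertex is shorter than a preset threshold. The data layout (an adjacency list per vertex) and the function names are assumptions for illustration, not the patent's implementation.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Squared Euclidean distance between two vertex positions.
static float squaredDistance(const std::array<float, 3>& a,
                             const std::array<float, 3>& b) {
    float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return dx * dx + dy * dy + dz * dz;
}

// Returns one flag per vertex: true when the vertex is dense enough
// (every adjacent edge shorter than lengthThreshold) to receive first shading.
std::vector<bool> selectFirstShadingVertices(
    const std::vector<std::array<float, 3>>& positions,
    const std::vector<std::vector<int>>& adjacency,  // adjacent vertex indices
    float lengthThreshold) {
    std::vector<bool> useFirstShading(positions.size(), false);
    const float threshold2 = lengthThreshold * lengthThreshold;
    for (std::size_t v = 0; v < positions.size(); ++v) {
        bool allEdgesShort = true;
        for (int n : adjacency[v]) {
            if (squaredDistance(positions[v], positions[n]) > threshold2) {
                allEdgesShort = false;
                break;
            }
        }
        useFirstShading[v] = allEdgesShort && !adjacency[v].empty();
    }
    return useFirstShading;
}
```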

[24]

(2) Case in which the determiner (110) uses polygon area information

[25]

The determiner (110) can calculate the areas of the polygons of the 3D model. If the area of a polygon is smaller than a preset threshold value, the determiner (110) can determine the vertices constituting that polygon as vertices to which the first shading is applied.

[26]

(3) Case in which the determiner (110) uses distance information between vertices projected onto the screen area

[27]

The determiner (110) can determine the vertices to which the first shading is applied based on the edge lengths between adjacent vertices of the 3D model projected onto the screen area. For example, the determiner (110) can determine, as a vertex to which the first shading is applied, a vertex for which the longest of the edge lengths between that vertex and its adjacent vertices in the projected screen area is shorter than a preset threshold value.

[28]

(4) Case in which the determiner (110) uses area information of polygons projected onto the screen area

[29]

The determiner (110) can calculate the areas of the polygons of the 3D model projected onto the screen area. When the area of a polygon projected onto the screen area is smaller than a threshold value, the determiner (110) can determine the vertices constituting that polygon as vertices to which the first shading is applied.

[30]

(5) Case in which the determiner (110) uses information on the distance between a vertex and the virtual light source

[31]

The determiner (110) can calculate the distance between each vertex of the 3D model projected onto the screen area and the virtual light source, and determine the vertices for which the computed distance is smaller than a preset threshold value as vertices to which the first shading is applied.
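As a hedged illustration of criteria (4) and (5), the sketch below marks the vertices of a projected triangle for the first shading when the triangle's screen-space area or a vertex's distance to the virtual light source falls below the respective threshold. The function names, thresholds, and the assumption of triangular polygons are illustrative and not fixed by the patent.

```cpp
#include <array>
#include <cmath>
#include <vector>

// Area of a triangle given its 2D screen-space vertex positions.
static float screenTriangleArea(const std::array<float, 2>& a,
                                const std::array<float, 2>& b,
                                const std::array<float, 2>& c) {
    return 0.5f * std::fabs((b[0] - a[0]) * (c[1] - a[1]) -
                            (c[0] - a[0]) * (b[1] - a[1]));
}

static float distance3(const std::array<float, 3>& a,
                       const std::array<float, 3>& b) {
    float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Flags the three vertices of a projected triangle for first shading when the
// projected area is below areaThreshold, or when a vertex lies closer to the
// virtual light source than lightDistanceThreshold.
void markTriangleForFirstShading(
    const std::array<int, 3>& indices,
    const std::vector<std::array<float, 2>>& screenPositions,
    const std::vector<std::array<float, 3>>& worldPositions,
    const std::array<float, 3>& lightPosition,
    float areaThreshold, float lightDistanceThreshold,
    std::vector<bool>& useFirstShading) {
    float area = screenTriangleArea(screenPositions[indices[0]],
                                    screenPositions[indices[1]],
                                    screenPositions[indices[2]]);
    for (int i : indices) {
        bool nearLight =
            distance3(worldPositions[i], lightPosition) < lightDistanceThreshold;
        if (area < areaThreshold || nearLight) useFirstShading[i] = true;
    }
}
```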

[32]

According to another embodiment, the determiner (110) can determine the vertices to which the first shading is applied based not only on the characteristic information of the 3D model but also on the rendering rate of the 3D model (e.g., FPS (frames per second)). For example, when the rendering rate of the 3D model is slower than a reference rate, the determiner (110) can increase the proportion of vertices of the 3D model projected onto the screen area to which the first shading is applied. Conversely, when the rendering rate of the 3D model is faster than the reference rate, the determiner (110) can reduce the proportion of vertices of the 3D model projected onto the screen area to which the first shading is applied. Alternatively, when the rendering rate is determined to be sufficiently fast, the determiner (110) can determine that the first shading is not applied to any vertex and that only the second shading, whose processing complexity is higher than that of the first shading but whose rendering result image quality is better, is performed.
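Under one possible reading, this rate-based adjustment reduces to scaling the selection threshold with the measured frame rate, as in the hedged sketch below. The scaling factors and the names adjustEdgeThreshold, referenceFps, and fastFps are illustrative assumptions, not values from the patent.

```cpp
// Adjusts the edge-length threshold used for selecting first-shading vertices
// according to the measured rendering rate. A slower frame rate raises the
// threshold so that more vertices qualify for the cheaper first shading; a
// sufficiently fast frame rate disables first shading entirely so that only
// the higher-quality second shading runs.
float adjustEdgeThreshold(float currentThreshold, float measuredFps,
                          float referenceFps, float fastFps) {
    if (measuredFps >= fastFps) {
        return 0.0f;  // no vertex qualifies: second shading only
    }
    if (measuredFps < referenceFps) {
        return currentThreshold * 1.25f;  // slower than reference: more vertices
    }
    return currentThreshold * 0.8f;       // faster than reference: fewer vertices
}
```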

[33]

According to another embodiment, the determiner (110) can include a divider (150). The divider (150) can determine whether surface subdivision of the 3D model is possible and, for each surface for which subdivision is possible, divide the surface of the 3D model into a plurality of regions by performing the surface subdivision. For example, suppose that, because the edge lengths between vertices projected onto the screen area are longer than a threshold value, the corresponding vertices are determined not to be subject to the first shading. The divider (150) can determine whether to create a new vertex inside the polygon formed by those vertices. When creating a new vertex, the divider (150) can divide the corresponding polygon into a plurality of regions using the new vertex inside the polygon. The determiner (110) can again determine, for the vertices of the 3D model reconstructed through the surface subdivision, the vertices to which the first shading is applied.
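The following is a minimal sketch of the kind of split the divider (150) could perform, assuming triangular polygons and a new vertex placed at the centroid (the description of Figure 3 also allows an arbitrary interior position). The types and names are hypothetical.

```cpp
#include <array>
#include <vector>

struct Triangle { std::array<int, 3> v; };  // indices into the vertex list

// Splits one triangle into three by inserting a new vertex at its centroid.
// The new vertex is appended to `positions`; the three resulting sub-triangles
// are returned so the caller can replace the original face with them.
std::array<Triangle, 3> subdivideAtCentroid(
    const Triangle& tri, std::vector<std::array<float, 3>>& positions) {
    const auto& a = positions[tri.v[0]];
    const auto& b = positions[tri.v[1]];
    const auto& c = positions[tri.v[2]];
    std::array<float, 3> centroid = {(a[0] + b[0] + c[0]) / 3.0f,
                                     (a[1] + b[1] + c[1]) / 3.0f,
                                     (a[2] + b[2] + c[2]) / 3.0f};
    int newIndex = static_cast<int>(positions.size());
    positions.push_back(centroid);
    return {Triangle{{tri.v[0], tri.v[1], newIndex}},
            Triangle{{tri.v[1], tri.v[2], newIndex}},
            Triangle{{tri.v[2], tri.v[0], newIndex}}};
}
```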

[34]

The first shader (120) includes a shading performing unit (125), which determines the first shading effect by performing the first shading on the vertices determined by the determiner (110). The shading performing unit (125) can determine pixel values through interpolation based on the vertex shading values computed by the first shading, and deliver the interpolation result to the second shader (130). The second shader (130) includes a shading performing unit (135), which can determine the pixel area to which the second shading is to be applied based on the reference information indicating whether the first shading was applied. The shading performing unit (135) determines the second shading effect by performing the second shading on the determined pixel area. Here, the first shading has a relatively faster processing speed than the second shading, while the image quality of the rendering result produced by the second shading can be superior to that of the first shading.
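As a hedged sketch of this hand-off, the code below computes a per-vertex shading value (a simple Lambertian term is assumed, since the patent does not fix a lighting model) and shows how both the shaded intensity and the first-shading flag could be interpolated to a pixel with barycentric weights before reaching the second shader. All names are illustrative.

```cpp
#include <algorithm>
#include <array>

// Per-vertex (first) shading: a simple Lambertian diffuse term is assumed here.
float shadeVertexLambert(const std::array<float, 3>& normal,
                         const std::array<float, 3>& lightDir) {
    float d = normal[0] * lightDir[0] + normal[1] * lightDir[1] +
              normal[2] * lightDir[2];
    return std::max(d, 0.0f);
}

// Values the first shader hands to the second shader for one pixel: the
// interpolated vertex-shading intensity and the interpolated flag (reference
// information) indicating how strongly the first shading applies there.
struct InterpolatedPixel {
    float intensity;
    float firstShadingFlag;
};

// Barycentric interpolation of the three vertex results of a triangle.
InterpolatedPixel interpolateToPixel(const std::array<float, 3>& vertexIntensity,
                                     const std::array<float, 3>& vertexFlag,
                                     const std::array<float, 3>& barycentric) {
    InterpolatedPixel out{0.0f, 0.0f};
    for (int i = 0; i < 3; ++i) {
        out.intensity += barycentric[i] * vertexIntensity[i];
        out.firstShadingFlag += barycentric[i] * vertexFlag[i];
    }
    return out;
}
```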

[35]

Information about the vertices of the 3D model projected onto the screen area on which the first shading was not performed can be passed through the first shader (120) (pass-through) and delivered to the second shader (130). In addition, the first shading effect determined by the first shader (120) can be delivered to the rendering image generator (140) through the second shader (130).

[36]

According to one embodiment, the first shader (120) and the second shader (130) can correspond, respectively, to a vertex shader and a pixel shader in a 3D computer graphics rendering pipeline. For example, the first shader (120) can perform per-vertex shading (PVS), which determines the shading effect per vertex of the 3D model, and the second shader (130) can perform per-pixel shading (PPS), which determines the shading effect per pixel of the video frame in which the 3D model is represented.

[37]

PVS is a shading scheme in which shading values are calculated for each vertex of the 3D model and the pixel values of the rendering result image are determined by interpolating the calculated vertex shading values. PPS is a shading scheme in which the pixel values of the rendering result image are calculated per pixel of the video frame being rendered. The result of performing PVS can vary depending on the vertex information of the points that make up the 3D model. For example, when the number of vertices constituting the surface of the 3D model is low, the rendering speed is faster, but a sophisticated representation of the 3D model can be difficult. PPS, which calculates the shading effect per pixel, generally forms a rendering result image of better quality than PVS, but rendering can be slowed due to the excessive amount of calculation.

[38]

The first shader (120) can assign, to the vertices of the 3D model projected onto the screen area, vertex attribute values indicating whether the first shading was performed, and the reference information can be determined based on the attribute values of the corresponding vertices. For example, the first shader (120) can assign a flag value of "1" to vertices on which the first shading was performed and a flag value of "0" to vertices on which the first shading was not performed. The assigned flag information can be interpolated over the entire set of pixels of the video frame by the second shader (130) to determine the reference information. Pixels of the image frame located between a vertex assigned the flag value "0" and a vertex assigned the flag value "1" are assigned interpolated values between "0" and "1". The second shader (130) can, for example, perform the second shading on those pixels of the image frame whose reference information value is smaller than a preset threshold value.
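A hedged sketch of this thresholding step: given the per-pixel reference information produced by interpolating the "0"/"1" vertex flags, pixels whose value falls below the threshold are routed to the second shading. The default of 0.5 mirrors the example given with Figure 2; the function name is an assumption.

```cpp
#include <cstddef>
#include <vector>

// Selects the pixel area for the second shading: any pixel whose interpolated
// first-shading flag is below the threshold is considered insufficiently
// covered by per-vertex shading and is re-shaded per pixel.
std::vector<std::size_t> selectSecondShadingPixels(
    const std::vector<float>& interpolatedFlags, float threshold = 0.5f) {
    std::vector<std::size_t> pixelArea;
    for (std::size_t i = 0; i < interpolatedFlags.size(); ++i) {
        if (interpolatedFlags[i] < threshold) {
            pixelArea.push_back(i);
        }
    }
    return pixelArea;
}
```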

[39]

According to another embodiment, the second shader (130) can determine the area to which the second shading is to be applied based on the rendering rate of the 3D model. For example, the second shader (130) can enlarge the area to which the second shading is applied when the rendering rate is faster than a reference rate, and reduce the area to which the second shading is applied when the rendering rate is slower than the reference rate.

[40]

The rendering image generator (140) can generate the rendering result image of the 3D model based on the first shading effect determined by the first shader (120) and the second shading effect determined by the second shader (130).

[41]

By performing rendering within a single image frame using a mixture of the different shading types, the first shading and the second shading, based on the rendering speed or the characteristics of the 3D model, the 3D rendering device (100) can perform rendering at high speed while minimizing degradation of the quality of the rendering result image. For example, the 3D rendering device (100) can render at high speed while minimizing quality degradation by applying the faster first shading to regions of the 3D model where image quality deterioration is not prominent, and applying the second shading, which produces excellent image quality, to regions where quality matters.

[42]

According to another embodiment, the 3D rendering device (100) can determine, for every image frame, whether the first shading or the second shading is to be applied. The determiner (110) can determine the shading type, among the first shading and the second shading, to be applied to the current image frame based on the vertex information of the 3D model or the rendering rate of the 3D model. For example, when the rendering rate is slow, the determiner (110) can determine that the first shading, which calculates the shading effect per vertex, is performed on the current image frame. When the rendering rate is sufficiently fast, the determiner (110) can determine that the second shading, which calculates the shading effect per pixel, is performed on the current image frame. When the determined shading type is the first shading, the first shader (120) can determine the first shading effect by applying the first shading to the current image frame. When the determined shading type is the second shading, the second shader (130) can determine the second shading effect by applying the second shading to the current image frame. The rendering image generator (140) can generate the rendering result of the current image frame based on the first shading effect or the second shading effect. By selectively utilizing different shading schemes according to the rendering effect required for the image frame, the 3D rendering device (100) can perform 3D rendering with less deterioration in image quality.

[43]

Figure 2 is a diagram illustrating the performance of the first shading and the second shading according to one embodiment.

[44]

Referring to Figure 2, assume that the 3D model (210) projected onto the screen area comprises six vertices (222, 224, 226, 228, 230, 232). The 3D rendering device can determine the vertices on which the first shading is performed among the vertices (222, 224, 226, 228, 230, 232) of the projected 3D model (210). For example, the 3D rendering device can find, for each vertex of the 3D model (210), the longest of the edge lengths formed between that vertex and its adjacent vertices, and perform the first shading on vertices whose longest edge length is at or below a predetermined threshold value. Here, an edge length between vertices indicates the distance between adjacent vertices, such as the edge length (240) between vertex (222) and vertex (224) or the edge length (250) between vertex (226) and vertex (232).

[45]

Assuming that the edge lengths (240, 242, 244, 246, 248) are determined to be smaller than the threshold value and the edge lengths (250, 252, 254) are determined to be greater than the threshold value, the 3D rendering device can perform the first shading on the vertices (222, 224). The 3D rendering device determines that the vertices (226, 228, 230, 232), which form edges longer than the threshold value, are not suitable for the first shading. The 3D rendering device can perform the second shading on a pixel basis for the region formed by the vertices (226, 228, 230, 232).

[46]

The baseline (260) is a reference line indicating the boundary of the region of the image frame, in which the 3D model (210) is represented, where the second shading is performed. The 3D rendering device can determine reference information indicating that the first shading was performed on the vertices (222, 224). For example, the 3D rendering device can assign a flag value of "1" to the vertices (222, 224) on which the first shading was performed, and a flag value of "0" to the vertices (226, 228, 230, 232) on which the first shading was not performed. The flag values assigned to each of the vertices (222, 224, 226, 228, 230, 232) can be interpolated over all the pixels of the video frame to determine the reference information. For example, pixels located between the vertex (224), assigned the flag value "1", and the vertex (226), assigned the flag value "0", can have interpolated flag values between "0" and "1". The 3D rendering device can, for example, perform the second shading on those pixels of the video frame whose flag value is 0.5 or less.

[47]

By separately determining, based on geometric characteristic information such as the edge lengths of the 3D model (210), the region to which the first shading is applied and the region to which the second shading is applied, the 3D rendering device can render the 3D model (210) at a faster speed while minimizing quality degradation in regions where excellent quality is required.

[48]

Figure 3 is a diagram illustrating the performance of the first shading and the second shading based on surface subdivision according to one embodiment.

[49]

In Figure 3, assume that, among the vertices (222, 224, 226, 228, 230, 232) of the 3D model (210) projected onto the screen area, the vertices (222, 224) have been determined by the 3D rendering device as vertices to which the first shading is applied. The 3D rendering device can determine whether the surfaces (polygons) (310, 320) of the 3D model formed by the other vertices (226, 228, 230, 232) can be subdivided, and reconstruct the 3D model (210) by performing surface subdivision on each surface for which subdivision is possible.

[50]

The result of dividing each of the surfaces (310, 320) of the 3D model (210) into three regions through surface subdivision is shown on the right of Figure 3. The 3D rendering device can, for example, divide each of the surfaces (310, 320) into a plurality of regions by adding a new vertex (330, 340) at the center of gravity or at an arbitrary position inside the surfaces (310, 320). The 3D rendering device can again determine the vertices to which the first shading is applied for the reconstructed 3D model. Assume that, through the surface subdivision, the vertices (222, 224, 226, 228, 230) have been determined as vertices to which the first shading is applied. The 3D rendering device performs the first shading on the vertices (222, 224, 226, 228, 230) and, similarly to Figure 2, can perform the second shading per pixel based on the reference information indicating whether the first shading was performed. The related description of Figure 2 can be referenced here. The baseline (350) is a reference line indicating the boundary of the region in which the second shading is performed; the pixel region in which the second shading is performed can change according to the surface subdivision.

[51]

Figure 4 is a flowchart illustrating the operation of a 3D rendering method according to one embodiment. The 3D rendering method according to one embodiment can be performed by a 3D rendering device including at least one processor.

[52]

In step (410), the 3D rendering device can determine, among the vertices of the 3D model projected onto the screen area, the vertices to which the first shading is to be applied. The 3D rendering device can determine the vertices to which the first shading is applied based on the characteristic information of the 3D model, the rendering speed, and the like.

[53]

In step (420), the 3D rendering device can perform the first shading on the vertices determined in step (410). The 3D rendering device can determine the first shading effect per vertex by performing the first shading on the determined vertices. In step (430), the 3D rendering device can determine the pixel area to which the second shading is to be applied, based on the reference information indicating whether the first shading was applied. In step (440), the 3D rendering device can perform the second shading on the pixel area determined in step (430). The 3D rendering device can determine the second shading effect per pixel of the determined pixel area.

[54]

In step (450), the 3D rendering device can generate the rendering result image based on the first shading effect and the second shading effect.
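Under the assumptions of the earlier sketches, the flow of steps (410) through (450) could be summarized as the following driver; the step implementations are injected as callables, so the sketch stays independent of any particular shading model, and none of the names come from the patent.

```cpp
#include <functional>
#include <vector>

struct FrameResult { std::vector<float> pixels; };

// One rendering pass following steps (410)-(450): select first-shading
// vertices, run first shading on them (producing the per-pixel reference
// information), derive the second-shading pixel area from that information,
// run second shading there, and compose the rendering result image.
FrameResult renderFrame(
    const std::function<std::vector<int>()>& determineFirstShadingVertices,          // step 410
    const std::function<std::vector<float>(const std::vector<int>&)>& runFirstShading,   // step 420
    const std::function<std::vector<int>(const std::vector<float>&)>& determinePixelArea, // step 430
    const std::function<std::vector<float>(const std::vector<int>&)>& runSecondShading,   // step 440
    const std::function<FrameResult(const std::vector<float>&,
                                    const std::vector<float>&)>& composeImage) {          // step 450
    std::vector<int> vertices = determineFirstShadingVertices();
    std::vector<float> referenceInfo = runFirstShading(vertices);
    std::vector<int> pixelArea = determinePixelArea(referenceInfo);
    std::vector<float> secondEffect = runSecondShading(pixelArea);
    return composeImage(referenceInfo, secondEffect);
}
```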

[55]

For the content described with reference to Figure 4, the content described above with reference to Figures 1 to 3 can be referenced.

[56]

Figure 5 is a flowchart illustrating the operation of a 3D rendering method according to another embodiment. The 3D rendering method according to another embodiment can be performed by a 3D rendering device including at least one processor.

[57]

In step (510), the 3D rendering device can determine, among the vertices of the 3D model projected onto the screen area, the vertices to which the first shading is to be applied.

[58]

In step (520), the 3D rendering device can determine whether surface subdivision of the 3D model is applicable. When surface subdivision is applicable, in step (530), the 3D rendering device can perform subdivision of the surfaces of the 3D model. Through the surface subdivision, the surfaces of the 3D model are divided into a plurality of regions and the 3D model can be reconstructed. The 3D rendering device can again determine the vertices to which the first shading is applied for the reconstructed 3D model.

[59]

When surface subdivision is not applied to the 3D model, in step (540), the 3D rendering device can perform the first shading on the vertices determined in step (510). Alternatively, in step (540), the 3D rendering device can perform the first shading on the vertices determined based on the result of the surface subdivision in step (530).

[60]

In step (550), the 3D rendering device can determine the pixel area to which the second shading is to be applied, based on the reference information indicating whether the first shading was applied. In step (560), the 3D rendering device can perform the second shading on the pixel area determined in step (550). The 3D rendering device can determine the second shading effect per pixel of the determined pixel area.

[61]

In step (570), the 3D rendering device can generate the rendering result image based on the first shading effect and the second shading effect.

[62]

For the content described with reference to Figure 5, the content described above with reference to Figures 1 to 4 can be referenced.

[63]

Figure 6 is a diagram illustrating the configuration of a 3D rendering device according to another embodiment. A 3D model is input to the 3D rendering device (600) for 3D rendering. The 3D rendering device (600) can perform 3D rendering of the input 3D model. The 3D rendering device (600) can perform one or more of the 3D rendering methods described or shown herein. The 3D rendering device (600) can output a rendering result image (630).

[64]

The 3D rendering device (600) can include a processor (610) and a memory (620). The memory (620) communicates with the processor (610) and can store instructions executable by the processor (610) or data to be computed by the processor (610). The processor (610) executes, in hardware, the instructions stored in the memory (620). To execute instructions, the processor (610) can retrieve (or fetch) them from an internal register, an internal cache, the memory (620), or storage, and then execute them. Subsequently, the processor (610) can record one or more execution results to the internal register, the internal cache, the memory (620), or the storage. The processor (610) can execute instructions for performing one or more of the operations described with reference to Figures 1 to 5. According to one embodiment, the processor (610) can be implemented as a graphics processing unit (GPU). In this case, the first shading can be carried out in a vertex shader, and the second shading can be carried out in a pixel shader or a fragment shader.
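As a hedged sketch of how that GPU mapping could look at the fragment stage, the C++ function below mimics the decision a pixel/fragment shader would make: if the interpolated flag indicates that the first shading covers the pixel, it keeps the color interpolated from the vertex stage; otherwise it recomputes lighting per pixel. The lighting model (Lambert), the 0.5 threshold, and all names are illustrative assumptions, not the patent's shader code.

```cpp
#include <algorithm>
#include <array>

// Inputs the rasterizer would interpolate from the vertex stage.
struct FragmentInput {
    std::array<float, 3> vertexStageColor;  // color from first (per-vertex) shading
    std::array<float, 3> normal;            // interpolated surface normal
    float firstShadingFlag;                 // interpolated reference information
};

// Emulates the fragment-stage choice: reuse the per-vertex result when the
// flag is high enough, otherwise perform the second (per-pixel) shading.
std::array<float, 3> shadeFragment(const FragmentInput& in,
                                   const std::array<float, 3>& lightDir,
                                   const std::array<float, 3>& baseColor,
                                   float threshold = 0.5f) {
    if (in.firstShadingFlag >= threshold) {
        return in.vertexStageColor;  // first shading effect, already interpolated
    }
    float d = std::max(in.normal[0] * lightDir[0] + in.normal[1] * lightDir[1] +
                           in.normal[2] * lightDir[2],
                       0.0f);
    return {baseColor[0] * d, baseColor[1] * d, baseColor[2] * d};  // per-pixel
}
```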

[65]

The embodiments described above can be implemented as hardware components, software components, and/or a combination of hardware components and software components. For example, the devices, methods, and components described in the embodiments can be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of responding to and executing instructions. The processing device can run an operating system (OS) and one or more software applications executed on the operating system. In addition, the processing device can access, store, manipulate, process, and generate data in response to the execution of the software. For ease of understanding, the processing device is sometimes described as being used singly; however, a person skilled in the art will understand that the processing device can include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device can include a plurality of processors, or one processor and one controller. Other processing configurations, such as a parallel processor, are also possible.

[66]

The software can include a computer program, code, instructions, or a combination of one or more of these, and can configure the processing device to operate as desired or instruct the processing device, independently or collectively. The software and/or data can be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device. The software can be distributed over computer systems connected by a network and stored or executed in a distributed manner. The software and data can be stored in one or more computer-readable recording media.

[67]

The methods according to the embodiments can be embodied in the form of program instructions that can be executed through various computing means and recorded on computer-readable media. The computer-readable media can include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the media can be those specially designed and constructed for the embodiments, or those publicly known and available to those skilled in computer software. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter. The hardware devices described above can be configured to operate as one or more software modules in order to perform the operations of the embodiments, and vice versa.

[68]

Although the embodiments have been described above with reference to the limited drawings, a person skilled in the art can apply various modifications and variations based on this description. For example, the described techniques may be performed in an order different from the described method, and/or components such as the described systems, structures, devices, and circuits may be combined or assembled in a form different from the described method, or replaced or substituted by other components or equivalents, and appropriate results can still be achieved. Therefore, other implementations, other embodiments, and equivalents of the claims also fall within the scope of the claims.



[1]

The present invention relates to a three-dimensional rendering method and a device thereof. According to an embodiment, the three-dimensional rendering device determines a vertex to be applied to first shading of vertices of a three-dimensional model based on the characteristic information of the three-dimensional model, and performs the first shading on the determined vertex to determine a first shading effect. The three-dimensional rendering device determines a pixel area to which second shading is applied based on reference information indicating whether the first shading is applied, and performs the second shading in the pixel area to determine a second shading effect. The three-dimensional rendering device can generate a rendering result image based on the first shading effect and the second shading effect.

[2]

COPYRIGHT KIPO 2017

[3]

[4]

  • (110) Determiner
  • (120) First shader
  • (125,135) Shading performing unit
  • (130) Second shader
  • (140) Rendering image generator
  • (150) Divider
  • (AA) 3D model
  • (BB) Rendering result image



A 3D rendering method comprising: determining, based on characteristic information of a 3D model, a vertex of the 3D model to which first shading is to be applied; determining a first shading effect by performing the first shading on the determined vertex; determining a pixel area to which second shading is to be applied, based on reference information indicating whether the first shading was applied; determining a second shading effect by performing the second shading on the determined pixel area; and generating a rendering result image based on the first shading effect and the second shading effect.

The 3D rendering method according to Claim 1, wherein determining the vertex to which the first shading is to be applied comprises determining the vertex based on edge lengths between adjacent vertices of the 3D model projected onto a screen area.

The 3D rendering method according to Claim 1, wherein determining the vertex to which the first shading is to be applied comprises determining the vertex based on the area of a polygon of the 3D model formed by the vertex.

The 3D rendering method according to Claim 1, further comprising: determining whether surface subdivision of the 3D model is possible; and dividing a surface of the 3D model into a plurality of regions when the surface can be subdivided, wherein determining the vertex to which the first shading is to be applied comprises determining the vertex based on the division result.

The 3D rendering method according to Claim 1, wherein determining the pixel area to which the second shading is to be applied comprises determining the pixel area to which the second shading is applicable based on the reference information corresponding to pixels of an image frame.

The 3D rendering method according to Claim 1, wherein determining the vertex to which the first shading is to be applied comprises determining the vertex based on at least one of vertex density information of the 3D model, area information of polygons composed of vertices of the 3D model, distance information between vertices projected onto a screen area, area information of polygons projected onto the screen area, and information on a distance between the vertex and a virtual light source.

The 3D rendering method according to Claim 1, wherein the characteristic information of the 3D model is determined based on at least one of vertex information of the 3D model, position information of a virtual camera, direction information of the virtual camera, position information of a virtual light source, and direction information of the virtual light source.

The 3D rendering method according to Claim 1, wherein the first shading determines the first shading effect per vertex of the 3D model, and the second shading determines the second shading effect per pixel of a video frame in which the 3D model is represented.

The 3D rendering method according to Claim 1, wherein determining the area to which the second shading is to be applied comprises determining the area to which the second shading is applied based on a rendering rate of the 3D model and the reference information.

A computer program stored on a medium, in combination with hardware, to execute the method of any one of Claims 1 to 9.

A 3D rendering device comprising: a determiner that determines, based on characteristic information of a 3D model, a vertex among vertices of the 3D model to which first shading is to be applied; a first shader that determines a first shading effect by performing the first shading on the determined vertex; a second shader that determines a pixel area to which second shading is to be applied, based on reference information indicating whether the first shading was applied, and determines a second shading effect by performing the second shading on the determined pixel area; and a rendering image generator that generates a rendering result image based on the first shading effect and the second shading effect.

The 3D rendering device according to Claim 11, wherein the first shader assigns, to the vertices of the 3D model projected onto a screen area, vertex attribute values indicating whether the first shading was performed.

The 3D rendering device according to Claim 11, wherein the determiner comprises a divider that determines whether a surface of the 3D model can be subdivided and, when the surface can be subdivided, divides the surface of the 3D model into a plurality of regions.

The 3D rendering device according to Claim 11, wherein the first shading determines the first shading effect per vertex of the 3D model, and the second shading determines the second shading effect per pixel of a video frame in which the 3D model is represented.

A 3D rendering device comprising: a determiner that determines a shading type, among first shading and second shading, to be applied to a current image frame; a first shader that, when the determined shading type is the first shading, determines a first shading effect by performing the first shading on the current image frame; a second shader that, when the determined shading type is the second shading, determines a second shading effect by performing the second shading on the current image frame; and a rendering image generator that generates a rendering result of the current image frame based on the first shading effect or the second shading effect.

The 3D rendering device according to Claim 15, wherein the determiner determines the shading type to be applied to the current image frame based on at least one of vertex information of the 3D model and a rendering rate of the 3D model.



CPC classification

G06T15/80

IPC classification

G06T15/80