TRANSMITTING DEVICE, TRANSMITTING METHOD, RECEIVING DEVICE, RECEIVING METHOD, PROGRAM, AND BROADCASTING SYSTEM
This application claims the benefit of priority of Provisional Application Ser. No. 61/470,191, filed Mar. 31, 2011, the entire content of which is incorporated herein by reference. The present disclosure relates to a transmitting device, a transmitting method, a receiving device, a receiving method, a program, and a broadcasting system, and particularly to a transmitting device, a transmitting method, a receiving device, a receiving method, a program, and a broadcasting system that are suitable for use in the case of executing content of data broadcasting in conjunction with the progression of a show and a commercial message (CM) in e.g. digital television broadcasting. In Japan, digitalization of television broadcasting is being promoted, and terrestrial digital broadcasting, BS digital broadcasting, etc. are prevalent. In digital television broadcasting such as the terrestrial digital broadcasting, not only broadcasting of shows such as news, dramas, and movies but also so-called data broadcasting is realized. According to content in this data broadcasting, for example, information relating to the on-air show (performer, story, etc.) can be displayed, and information having no relation to the on-air show (announcement of another show, news, weather forecast, traffic information, etc.) can also be displayed (refer to e.g. Japanese Patent Laid-open No. 2006-50237). For the data broadcasting in Japan, in the digitalization of the television broadcasting, the band dedicated to the data broadcasting is ensured in advance in the broadcasting band of the digital television broadcasting. The data broadcasting in Japan is realized by broadcasting data broadcasting content by use of this dedicated band. In contrast, in the digital television broadcasting in the United States, a band dedicated to the data broadcasting like that in the digital television broadcasting in Japan is not ensured. Specifically, as shown in However, ensuring the band for broadcasting data broadcasting content by narrowing the band for video and the band for audio leads to deterioration of the image quality and sound quality of shows and therefore is far from a practical countermeasure. Furthermore, even if the band for data broadcasting content is ensured by narrowing the band for video and the band for audio, the amount of data that can be transmitted is limited, and therefore the amount of information in the data broadcasting content will be small. If an attempt is made to increase the amount of information, it will take a long time for the receiving side to receive the necessary data. In addition, in the United States, a retransmission system for digital television shows by use of a cable TV (CATV) network is prevalent, and therefore the following problem may also occur. This retransmission system is composed mainly of a broadcasting device 1, a CATV retransmission device 2, a CATV network 3, a digital television receiver 4, a set-top box (STB) 5, and a television receiver 6. The broadcasting device 1 provided in e.g. a broadcasting station broadcasts a digital television broadcast signal by using a terrestrial wave or a satellite wave. The CATV retransmission device 2 provided in e.g. a cable TV station receives the digital television broadcasting, removes unnecessary information, and adds original information of the CATV station to the received broadcasting. Subsequently, the CATV retransmission device 2 retransmits the broadcasting to the digital television receiver 4, the set-top box 5, and so forth via the CATV network 3.
The CATV retransmission device 2 includes a tuner 11, a PID filter 12 for filtering out packets of predetermined packet IDs, a CATV original signal generator 13, a multiplexer 14, and a modulator 15. The tuner 11 receives and demodulates digital television broadcast signals of the respective channels and outputs the resulting transport stream (TS) to the PID filter 12. The PID filter 12 removes a packet corresponding to a predetermined packet ID (a packet having no relation to the AV content as the show) from the TS and outputs the resulting TS to the multiplexer 14. The CATV original signal generator 13 generates a packet in which original information of the CATV station is stored and outputs it to the multiplexer 14. The multiplexer 14 multiplexes the output of the PID filter 12 and the output of the CATV original signal generator 13 and outputs the resulting TS to the modulator 15. The modulator 15 modulates the output of the multiplexer 14 by a modulation system suitable for the CATV network 3 and retransmits the modulated TS to the digital television receiver 4, the set-top box 5, and so forth via the CATV network 3. The digital television receiver 4 compliant with the CATV receives the TS of the retransmitted digital television broadcasting via the CATV network 3 and decodes the TS, to display the resulting video and output the resulting audio. The set-top box 5 compliant with the CATV receives the TS of the retransmitted digital television broadcasting via the CATV network 3 and decodes the TS, to output the resulting video signal and audio signal to the television receiver 6 via e.g. an HDMI cable. Based on the video signal and the audio signal input from the set-top box 5 via e.g. the HDMI cable, the television receiver 6 displays video and outputs audio. As described above, in the CATV retransmission device 2, the packet corresponding to the predetermined packet ID (the packet having no relation to the AV content as the show) is removed from the TS of the digital broadcast signal by the PID filter 12. Therefore, even if the band for broadcasting data broadcasting content is ensured in the broadcasting band as shown in There is a need for a technique to allow realization of data broadcasting content that can operate in conjunction with the progression of show and CM of television broadcasting without setting the band for data broadcasting in the broadcasting band of digital television broadcasting. According to a first embodiment of the present disclosure, there is provided a transmitting device including an audio encoder configured to generate an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in a receiving device is buried, and a transmitter configured to transmit the generated encoded audio stream to the receiving device. The transmitting device further includes a controller configured to supply metadata in which the trigger information is stored and size information for burying the metadata in a user data area of the encoded audio stream, and to carry out control so that the metadata may be buried in the user data area. The audio encoder encodes an audio stream by an AC3 (Audio Code number 3) system to generate the encoded audio stream, and the metadata is inserted in an area of AUX (AUXILIARY DATA) in the frame structure of the AC3 system.
The audio encoder encodes an audio stream by an AAC (Advanced Audio Coding) system to generate the encoded audio stream, and the metadata is inserted in an area of DSE (Data Stream Element) in the frame structure of the AAC system. The transmitting device further includes a video encoder configured to encode a video stream to generate an encoded video stream, and a multiplexer configured to multiplex the encoded audio stream and the encoded video stream to generate a multiplexed stream. The transmitter transmits the generated multiplexed stream. Type information indicating the type of information is added to the metadata. A plurality of kinds of information distinguished by an information identifier are included in the metadata. A transmitting method or a program according to the first embodiment of the present disclosure is a transmitting method or a program corresponding to the above-described transmitting device according to the first embodiment of the present disclosure. In the first embodiment of the present disclosure, the encoded audio stream in which the trigger information relating to control of the application program to be executed in conjunction with content in the receiving device is buried is generated, and the generated encoded audio stream is transmitted to the receiving device. According to a second embodiment of the present disclosure, there is provided a receiving device including a receiver configured to receive an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content is buried. The encoded audio stream is transmitted from a transmitting device. The receiving device further includes an audio decoder configured to decode the received encoded audio stream, and a controller configured to control processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream. The audio decoder acquires the trigger information stored in metadata from an area of AUX in the frame structure of the encoded audio stream encoded by an AC3 system. The audio decoder acquires the trigger information stored in metadata from an area of DSE in the frame structure of the encoded audio stream encoded by an AAC system. The receiving device further includes a demultiplexer configured to demultiplex a received multiplexed stream, and a video decoder configured to decode an encoded video stream demultiplexed from the multiplexed stream. The audio decoder decodes the encoded audio stream demultiplexed from the multiplexed stream. A receiving method or a program according to the second embodiment of the present disclosure is a receiving method or a program corresponding to the above-described receiving device according to the second embodiment of the present disclosure. In the second embodiment of the present disclosure, the encoded audio stream in which the trigger information relating to control of the application program to be executed in conjunction with the content is buried, transmitted from the transmitting device, is received, and the received encoded audio stream is decoded. Furthermore, processing relating to the application program executed in conjunction with the content is controlled in response to the trigger information obtained by decoding the encoded audio stream. 
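As a minimal illustration of the embodiments summarized above, the following sketch captures only two points: the user data area carrying the trigger metadata depends on the audio encoding system (AUX for the AC3 system, DSE for the AAC system), and the controller supplies size information so that the encoder can confirm the metadata fits. The helper names are hypothetical; this is a sketch under those assumptions, not the disclosed encoder itself.

```python
# Hypothetical helper names; a sketch of the relationships described above,
# not the encoder implementation.

USER_DATA_AREA = {
    "AC3": "AUX",   # AUXILIARY DATA area in the frame structure of the AC3 system
    "AAC": "DSE",   # Data Stream Element in the frame structure of the AAC system
}

def plan_metadata_insertion(codec: str, metadata: bytes, reserved_size: int) -> str:
    """Return the user data area to use after checking the size information
    supplied by the controller for burying the metadata."""
    if codec not in USER_DATA_AREA:
        raise ValueError(f"unsupported audio encoding system: {codec!r}")
    if len(metadata) > reserved_size:
        raise ValueError("metadata does not fit in the reserved user data area")
    return USER_DATA_AREA[codec]
```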
According to a third embodiment of the present disclosure, there is provided a broadcasting system including a transmitting device configured to transmit content, and a receiving device configured to receive the transmitted content. The transmitting device includes an audio encoder that generates an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in the receiving device is buried, and a transmitter that transmits the generated encoded audio stream to the receiving device. The receiving device includes a receiver that receives the encoded audio stream transmitted from the transmitting device, an audio decoder that decodes the received encoded audio stream, and a controller that controls processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream. In the third embodiment of the present disclosure, by the transmitting device, the encoded audio stream in which the trigger information relating to control of the application program to be executed in conjunction with content in the receiving device is buried is generated, and the generated encoded audio stream is transmitted to the receiving device. By the receiving device, the encoded audio stream transmitted from the transmitting device is received, and the received encoded audio stream is decoded. Furthermore, processing relating to the application program executed in conjunction with the content is controlled in response to the trigger information obtained by decoding the encoded audio stream. According to the first embodiment of the present disclosure, data broadcasting content can be controlled in conjunction with the progression of show, CM, etc. of television broadcasting without setting the band for data broadcasting in the broadcasting band of digital television broadcasting. According to the second embodiment of the present disclosure, data broadcasting content can be controlled in conjunction with the progression of show, CM, etc. of television broadcasting without setting the band for data broadcasting in the broadcasting band of digital television broadcasting. According to the third embodiment of the present disclosure, data broadcasting content can be controlled in conjunction with the progression of show, CM, etc. of television broadcasting without setting the band for data broadcasting in the broadcasting band of digital television broadcasting. Best modes for carrying out the invention (hereinafter, referred to as embodiments) will be described in detail below with reference to the drawings. The data broadcasting content is realized by the receiving device activating an application program supplied to the receiving device. Therefore, hereinafter, the data broadcasting content will be referred to also as a data broadcasting application program or a data broadcasting application. The data broadcasting application may be composed of one piece of program data or may be a program data group composed of plural pieces of program data. This broadcasting system 30 is composed of a broadcasting device 41 and a server 42 that are provided on the broadcasting station side, and a reproducing device 59 and a receiving device 60 that are provided on the receiver side. The broadcasting device 41 transmits a digital television broadcast signal.
Furthermore, the broadcasting device 41 transmits trigger information as a command relating to the operation of data broadcasting content linked to AV content in such a manner that the trigger information is included in the digital television broadcast signal. Specifically, the trigger information is so transmitted as to be disposed in the transport stream (TS) of the digital television broadcast signal or buried in a video signal. The trigger information includes information indicating the kind of command, information indicating the acquisition source of a data broadcasting application, and so forth. Details of the trigger information will be described later. The server 42 supplies a data broadcasting application in response to a request from the receiving device 60 that accesses the server 42 via the Internet 50. The receiving device 60 receives a digital broadcast signal broadcast from the broadcasting device 41 and outputs video and audio of AV content to a monitor (not shown). Furthermore, the receiving device 60 accesses the server 42 via the Internet 50 and acquires data broadcasting content. It is to be noted that, this receiving device 60 may exist as a single device or may be included in e.g. a television receiver or a video recorder. The reproducing device 59 reproduces video and audio of AV content recorded in e.g. a predetermined recording medium and outputs them to the receiving device 60. The controller 51 generates trigger information in association with the progression of a video stream of show and CM input from the previous stage and outputs it to the video encoder 52 and the multiplexer 54. Furthermore, the controller 51 generates metadata in which trigger information is stored in association with the progression of the video stream and outputs it to the audio encoder 53 and the multiplexer 54. In the metadata, predetermined information having no direct relation to audio data, such as trigger information, is stored. Details of the metadata will be described later. The video encoder 52 encodes the video stream of show and CM input from the previous stage in accordance with a predetermined encoding system and outputs the resulting encoded video stream to the multiplexer 54. Examples of the encoding system in the video encoder 52 include MPEG2 system and H.264 system. In the encoding of the video stream, the video encoder 52 buries the trigger information from the controller 51 in the video stream and encodes the video stream, to output the resulting encoded video stream to the multiplexer 54. The audio encoder 53 encodes the audio stream corresponding to the video stream input to the video encoder 52 in accordance with a predetermined encoding system and outputs the resulting encoded audio stream to the multiplexer 54. Examples of the encoding system in the audio encoder 53 include AC3 (Audio Code number 3) system and AAC (Advanced Audio Coding) system. In the encoding of the audio stream, the audio encoder 53 buries the metadata from the controller 51 in the audio stream and encodes the audio stream, to output the resulting encoded audio stream to the multiplexer 54. The multiplexer 54 multiplexes the input encoded video stream and encoded audio stream and multiplexes also the trigger information or the metadata to output the resulting multiplexed stream to the sender 55. Specifically, the multiplexer 54 multiplexes the streams into e.g. a transport stream (TS). 
Alternatively, in consideration of network delivery of digital television broadcasting, the multiplexer 54 may multiplex the streams into the ISO base media file format (MP4) suitable for the network delivery. In the above description, the trigger information is buried in the video stream and the metadata is buried in the audio stream. In addition, the trigger information or the metadata is multiplexed into the multiplexed stream. However, only one of these steps may be carried out. Alternatively, the combination of any of these steps may be carried out. The sender 55 sends out the input multiplexed stream as a digital television broadcast signal. The tuner 61 receives and demodulates a digital television broadcast signal corresponding to the channel selected by the user and outputs the resulting TS to the demultiplexer 62. The demultiplexer 62 demultiplexes the TS input from the tuner 61 into an encoded audio stream (audio encoded signal), an encoded video stream (video encoded signal), and a control signal and outputs them to the switch 77, the video decoder 65, or the controller 68. Furthermore, the demultiplexer 62 extracts a PCR packet including trigger information disposed in the TS and outputs it to the trigger detector 66. To the HDMI I/F 76, AV content sent from the reproducing device 59 by communication compliant with the HDMI (High Definition Multimedia Interface) is input. The HDMI I/F 76 outputs the encoded audio stream (audio encoded signal) of the AV content from the reproducing device 59 to the switch 77 and outputs a video signal to the switch 78. As the input from the reproducing device 59 to the HDMI I/F 76, an audio signal is input in the decoded state in some cases, and an audio signal is input in the undecoded state in other cases. The present description will deal with the case in which an encoded audio stream is input. To the switch 77, the encoded audio stream from the demultiplexer 62 and the encoded audio stream from the HDMI I/F 76 are input. The switch 77 outputs one of the input encoded audio streams to the audio decoder 63 based on a preset instruction from the user. The audio decoder 63 decodes the input encoded audio stream and outputs the resulting audio stream (audio signal) to the audio output part 64 and the trigger detector 66. The audio output part 64 outputs the input audio signal to the subsequent stage (e.g. speaker). The video decoder 65 decodes the input encoded video stream and outputs the resulting video stream (video signal) to the trigger detector 66 and the switch 78. The trigger detector 66 detects the trigger information stored in the metadata buried in the input audio stream and outputs it to the controller 68. Furthermore, the trigger detector 66 detects the trigger information buried in the input video stream and outputs it to the controller 68 (if the trigger information is disposed only in the TS, these operations of the trigger detector 66 are unnecessary). In addition, the trigger detector 66 extracts the trigger information or the trigger information stored in the metadata from the PCR packet including the trigger information, input from the demultiplexer 62, and outputs it to the controller 68. With omission of the trigger detector 66, the trigger information may be extracted from the audio stream in the audio decoder 63 and the trigger information may be extracted from the video stream in the video decoder 65. 
In this case, in the demultiplexer 62, the trigger information multiplexed into the multiplexed stream is demultiplexed to be output directly to the controller 68. To the switch 78, the video signal from the HDMI I/F 76 and the video signal from the video decoder 65 are input. The switch 78 outputs one of the input video signals to the video output part 67 based on a preset instruction from the user. The video output part 67 outputs the video signal input from the switch 78 to the subsequent stage (e.g. display). Furthermore, the video output part 67 combines the video of data broadcasting content input from the application engine 74 and the video signal input from the video decoder 65 and outputs the resulting signal to the subsequent stage. The controller 68 runs a control program recorded in the memory 69 to thereby control the whole receiving device 60. Furthermore, the controller 68 controls acquisition, registration, activation, event firing, suspension, resume, stop, and so forth of the data broadcasting application based on the trigger information input from the trigger detector 66. In the memory 69, the control program run by the controller 68 is recorded. This control program can be updated based on the digital television broadcast signal or update data supplied via the Internet 50. The operation part 70 accepts various kinds of operation from the user and notifies the controller 68 of an operation signal corresponding to the operation. If the data broadcasting application is delivered by using the digital television broadcast signal, the recording part 71 retains the downloaded data broadcasting application in a recording medium such as a built-in hard disk. The communication I/F 72 connects to the server 42 via the Internet 50 in accordance with control from the application engine 74. The application engine 74 acquires the data broadcasting application from the server 42 via the communication I/F 72 and the Internet 50 in accordance with control from the controller 68 and makes the cache memory 73 retain it. The application engine 74 reads out and runs the data broadcasting application retained in the recording part 71 or the cache memory 73 in accordance with control from the controller 68. The application memory 75 is composed of a work memory 75A and a save memory 75B. The application engine 74 records data relating to the running data broadcasting application (specifically, including the hierarchy of displayed information and so forth) in the work memory 75A. Furthermore, when suspending the running data broadcasting application, the application engine 74 moves the data in the work memory 75A of the application memory 75 to the save memory 75B. When resuming the suspended data broadcasting application, the application engine 74 moves the data in the save memory 75B to the work memory 75A to restore the state before the suspension. Alternatively, two areas having the same size in the application memory 75 may be alternately switched between the roles of the work memory 75A and the save memory 75B. This makes it possible to omit the data movement between the work memory 75A and the save memory 75B (a minimal sketch of this handling is given below). The transmitting method of trigger information will be described below. As the transmitting method of trigger information, the following four kinds of methods are possible: (a) a method in which trigger information is inserted in a PCR packet, (b) a method in which trigger information is buried in a video signal, (c) a method in which trigger information is inserted in an encoded video stream, and (d) a method in which trigger information is inserted in an encoded audio stream.
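Before turning to these transmitting methods, the application memory handling described above can be illustrated with a minimal sketch. The class and method names are hypothetical and this is not the receiving device's actual code; the point is only that swapping the roles of two equally sized areas avoids copying data between the work memory 75A and the save memory 75B.

```python
# A sketch of the work/save memory handling (hypothetical names).

class ApplicationMemory:
    def __init__(self) -> None:
        self._areas = [{}, {}]   # two areas of the same size
        self._work = 0           # index of the area currently used as work memory

    @property
    def work(self) -> dict:
        return self._areas[self._work]

    @property
    def save(self) -> dict:
        return self._areas[1 - self._work]

    def suspend(self) -> None:
        # Instead of moving data from the work memory to the save memory,
        # just swap the roles of the two areas.
        self._work = 1 - self._work
        self.work.clear()        # the new work area is free for the next application

    def resume(self) -> None:
        # Swap back so that the saved state becomes the working state again.
        self._work = 1 - self._work
```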
Among the above-described methods (a) to (d), in the methods (b) and (c), trigger information is inserted as it is. In the methods (a) and (d), trigger information is inserted by using a generic metadata transmission format that can also carry information other than the trigger information. In the method (a), trigger information may also be inserted as it is. The metadata generic syntax used in the above-described methods (a) and (d) will be described below. sync_byte is a unique word indicating a metadata container. metadata_type indicates the type information of the metadata. This type information makes it possible to selectively transmit metadata of plural types. For example, 00000011 indicates that the metadata to be transmitted is trigger information. metadata_length indicates the number of subsequent bytes. metadata_ID is information for identifying the kind within the type of the metadata. This identifier makes it possible to simultaneously transmit plural kinds of information of the same type. metadata_counter is count information indicating, when a series of metadata is transmitted in a divided manner, which piece of the divided information the information to be transmitted is. This count information is the count value of a counter incremented every audio frame. metadata_start_flag indicates whether or not the information to be transmitted is the first piece of divided information when the series of metadata (metadata packet) is transmitted in a divided manner. For example, 1 indicates that the information is the first piece of divided information, and 0 indicates that the information is not the first piece but the piece subsequent to the divided information of the previous frame. sync_control_flag indicates whether or not the metadata is synchronously managed. 1 indicates that the metadata is synchronously managed by PTS in PTS_management( ). 0 indicates that the metadata is not synchronously managed. When sync_control_flag is 1, PTS_management( ) exists. Referring back to packet_type indicates the type information of the metadata similarly to metadata_type of metadata( ) ( For example, if information stored in the metadata is trigger information, metadata_Packet( ) of the trigger information, i.e. Trigger_info_data( ) ( The information stored in the metadata may be information other than the trigger information. For example, other service access information (Metadata for linking service) and disparity information (Metadata for disparity shifting data) can be stored. Details of the other service access information and the disparity information are described in e.g. Japanese Patent Application No. 2011-061549, which is an application by the present assignee. Details of the respective transmitting methods of trigger information will be described below. [(a) Method in which Trigger Information is Inserted in PCR Packet] As shown in As the trigger information, information of the same contents is transmitted plural times successively in consideration of radio interference and acquisition imperfection (reception miss) in the receiving device 60. As shown in In this manner, in the method (a), in which trigger information is inserted in a PCR packet, the trigger information or metadata in which the trigger information is stored is stored in transport_private_data_byte of the PCR packet.
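As a supplement to the metadata generic syntax described above, the following sketch serializes one metadata( ) container as a simple byte layout. The field names follow the description; the field widths, the sync word value, and the stand-in for PTS_management( ) are assumptions made only for illustration.

```python
import struct
from typing import Optional

SYNC_BYTE = 0xAA                 # assumed value of the unique word indicating a metadata container
TYPE_TRIGGER_INFO = 0b00000011   # metadata_type value for trigger information (per the text)

def build_metadata_packet(payload: bytes, metadata_id: int, counter: int,
                          start_flag: bool, pts: Optional[int] = None) -> bytes:
    """Pack one metadata() container carrying (a piece of divided) metadata."""
    sync_control_flag = pts is not None          # 1: synchronously managed by PTS
    flags = (int(start_flag) << 1) | int(sync_control_flag)
    tail = struct.pack(">BBB", metadata_id & 0xFF, counter & 0xFF, flags)
    if sync_control_flag:
        tail += struct.pack(">Q", pts)           # stands in for PTS_management()
    tail += payload
    # metadata_length is the number of bytes that follow it.
    return struct.pack(">BBH", SYNC_BYTE, TYPE_TRIGGER_INFO, len(tail)) + tail
```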
[(b) Method in which Trigger Information is Buried in Video Signal] In either example of Furthermore, in either example of In this manner, in the method (b), in which trigger information is buried in a video signal, the trigger information is buried in a predetermined area of the image of the video signal. [(c) Method in which Trigger Information is Inserted in Encoded Video Stream] If trigger information is inserted in an encoded video stream of MPEG2, user_data in the picture layer in video_sequence( ) is utilized. In user_data_start_code, 0x000001B2 is described as a fixed value. In Trigger_Info_Data_identifier, 0x54524749 (“TRGI”) is described as a fixed value. In Trigger_info_data( ), trigger information, i.e. Trigger_info_data( ) ( In this manner, in the method (c), in which trigger information is inserted in an encoded video stream, the trigger information is inserted in an area of user_data( ) of video_sequence( ). [(d) Method in which Trigger Information is Inserted in Encoded Audio Stream] In this manner, in the method (d), in which trigger information is inserted in an encoded audio stream, metadata in which the trigger information is stored is inserted in an area of AUX if encoding is performed by the AC3 system, and is inserted in an area of DSE if encoding is performed by the AAC system. Although the cases in which the AC3 system and the AAC system are employed as the encoding system have been described, it is also possible to apply this method to another encoding system. Details of trigger information will be described below. The trigger information is classified into five kinds depending on the kind of command included in the trigger information. Two combinations of five kinds of commands have been proposed. The first combination (hereinafter, referred to as first embodiment) is composed of commands of Pre_cache, Execute, Inject_event, Suspend, and Terminate. The second combination (hereinafter, referred to as second embodiment) is composed of commands of Register, Execute, Inject_event, Suspend, and Terminate. First, five kinds of commands in the first embodiment will be described. The second embodiment will be described later with reference to Trigger_id is information for identification of this trigger information. If trigger information of the same contents is transmitted plural times, Trigger_id of the respective pieces of trigger information is the same. Protocol_version indicates the version of the protocol of this trigger information. Command_code indicates the kind of command of this trigger information. In the case of Trigger_validity is a parameter value N of server access distribution indicating the probability that the respective receiving devices 60 that have received this trigger information execute processing in accordance with this trigger information. Due to the provision of this value, in acquisition of a data broadcasting application by the plural receiving devices 60 from the server 42, the access can be distributed without concentrating on one period. For example, to distribute access from the receiving devices 60, which possibly exist in large numbers, to the server 42 into four times of access, the same trigger information is transmitted four times and the parameter N of server access distribution is set as follows. Specifically, the parameter N in the trigger information of the first round is set to 4, and the parameter N in the trigger information of the second round is set to 3. 
In addition, the parameter N in the trigger information of the third round is set to 2, and the parameter N in the trigger information of the fourth round is set to 1. App_id is identification information of the data broadcasting application acquired corresponding to this trigger information. App_type is information indicating the type (e.g. HTML5, java) of the data broadcasting application corresponding to this trigger information. App_url is the URL of the acquisition source of the data broadcasting application. Broadcast_App_flag, Downloaded_App_flag, and Internet_App_flag are flags indicating where the data broadcasting application corresponding to this trigger information exists. Broadcast_App_flag is set to 1 if the data broadcasting application corresponding to this trigger information can be acquired from a digital television broadcast signal. Downloaded_App_flag is set to 1 if the data broadcasting application corresponding to this trigger information has been already downloaded and can be acquired from a local storage (e.g. recording part 71). Internet_App_flag is set to 1 if the data broadcasting application corresponding to this trigger information can be acquired from the server 42 via the Internet 50. Trigger_id, Protocol_version, Command_code, Trigger_validity, App_id, App_type, App_url, Broadcast_App_flag, Downloaded_App_flag, and Internet_App_flag are the same as those of the trigger information as the Pre_cache command shown in App_life_scope indicates the range in which the running state is continued without stopping the running data broadcasting application when switching of e.g. the channel occurs. App_expire_date indicates the time and date when the running data broadcasting application is stopped although the Terminate command is not received. Trigger_id, Protocol_version, Command_code, Trigger_validity, App_id, and App_type are the same as those of the trigger information as the Pre_cache command shown in Event_id is identification information of the event that should be fired in the data broadcasting application specified by App_id. In Event Embedded Data, the data used as reference in event firing is described. Trigger_id, Protocol_version, Command_code, Trigger_validity, App_id, and App_type are the same as those of the trigger information as the Pre_cache command shown in Trigger_id, Protocol_version, Command_code, Trigger_validity, App_id, and App_type are the same as those of the trigger information as the Pre_cache command shown in The operation of the receiving device 60 in accordance with trigger information will be described below. For example, as shown in If the user selects this icon, as shown in When the contents of the show further transition (in the present case, transition to sports information), trigger information of the Inject_event command is transmitted in conjunction with this transition. When this trigger information is received, an event is fired, and the displaying by the data broadcasting application on the screen is changed as shown in Thereafter, prior to CM broadcasting, trigger information of the Suspend command for the running data broadcasting application corresponding to the show is transmitted. When this trigger information is received, the data broadcasting application corresponding to the show is suspended. Thereafter, trigger information of the Execute command for the data broadcasting application corresponding to the CM is transmitted. 
When this trigger information is received, the data broadcasting application of the CM is activated. Thereby, as shown in If the user selects this icon, displaying by the data broadcasting application corresponding to the CM (in the present case, displaying for participation in a prize competition) is carried out on the screen. After the end of the CM, in synchronization with show resumption, trigger information of the Execute command for the data broadcasting application corresponding to the show is transmitted. When the trigger information is received, as shown in When the show ends, in conjunction with this end, trigger information of the Terminate command for the data broadcasting application corresponding to the show is transmitted. When this trigger information is received, the data broadcasting application is stopped, and the displaying of the data broadcasting application is erased from the screen and only the video of the show is displayed as shown in The method for displaying the data broadcasting application is not limited to the method in which displaying of the show is reduced to make the area for displaying of the data broadcasting application as shown in With reference to In a step S101, the controller 51 generates trigger information associated with the progression of a video stream of show and CM input from the previous stage. In a step S102, the video encoder 52 encodes the video stream of show and CM input from the previous stage and outputs the resulting encoded video stream to the multiplexer 54. In a step S103, the controller 51 determines whether or not to insert the trigger information in an encoded audio stream based on a preset instruction from the user. If it is determined to insert the trigger information in the encoded audio stream, the processing is forwarded to a step S104. In the step S104, the controller 51 generates metadata based on the trigger information and outputs the metadata to the audio encoder 53 together with size information for burying this metadata in the user data area. In a step S105, the audio encoder 53 encodes an audio stream and inserts the metadata from the controller 51 in the audio stream based on the size information from the controller 51, to output the resulting encoded audio stream to the multiplexer 54. For example, if the encoding system is the AC3 system ( The audio encoder 53 performs the encoding with the size S as the target value, and encodes the audio data in such a manner that the total size of the mantissa data of Audblock 5, AUX, and CRC does not exceed 3/8 of the whole. Furthermore, the audio encoder 53 inserts the metadata in an area of AUX and adds the CRC to complete the stream. Thereby, in the area of AUX (AUXILIARY DATA) in If the encoding system is the AAC system ( It is also possible for the audio encoder 53 to perform the encoding in two separate passes. In this case, first the audio encoder 53 performs normal encoding, i.e. encoding for the case in which DSE or AUX is absent, and thereafter inserts the metadata in DSE or AUX with the size reserved in advance and performs encoding again. In the above-described manner, in the audio encoder 53, processing for burying the metadata in the user data area of the encoded audio stream (e.g. AUX in the case of the AC3 system or DSE in the case of the AAC system) is executed, and the processing is forwarded to a step S107. If it is determined in the step S103 not to insert the trigger information in the encoded audio stream, the processing is forwarded to a step S106.
In the step S106, the audio encoder 53 encodes the audio stream and outputs the resulting encoded audio stream to the multiplexer 54. Thereafter, the processing is forwarded to the step S107. In the step S107, the multiplexer 54 multiplexes the encoded video stream output from the video encoder 52 and the encoded audio stream output from the audio encoder 53 and outputs the resulting multiplexed stream to the sender 55. In a step S108, the sender 55 sends out (transmits) the multiplexed stream input from the multiplexer 54 as a digital television broadcast signal. Thereafter, the processing is returned to the step S101, so that this and subsequent steps are repeated. This is the end of the description of the trigger information transmission processing. In the above description of the trigger information transmission processing, among the above-described methods (a) to (d) for transmitting trigger information, the method (d), in which trigger information is inserted in an encoded audio stream, is explained. However, trigger information and metadata can be buried also in the methods (a) to (c) similarly. For example, if the method (a) is employed, trigger information or metadata is inserted in a PCR packet in the multiplexing by the multiplexer 54. If the method (b) is employed, trigger information is buried in a video signal of a video stream. If the method (c) is employed, trigger information is inserted in an encoded video stream in the encoding by the video encoder 52. Trigger information response processing when the receiving device 60 receives trigger information will be described below with reference to In a step S1, the trigger detector 66 determines whether or not trigger information is received. As the condition of this determination, if the above-described method (a) is employed, the trigger detector 66 waits until a PCR packet including trigger information is input from the demultiplexer 62. If the method (b) or (c) is employed, the trigger detector 66 waits until trigger information is detected from a video signal output from the video decoder 65. If the method (d) is employed, the trigger detector 66 waits until trigger information stored in metadata is detected from an audio signal output from the audio decoder 63. If a PCR packet including trigger information is input or trigger information is detected, the processing is forwarded to a step S2. In the step S2, the trigger detector 66 outputs the trigger information to the controller 68. The controller 68 reads out Trigger_id of the trigger information and determines whether or not the processing of a step S3 and subsequent steps has been already executed for this trigger information. If it is determined that the processing of the step S3 and subsequent steps has been already executed, the processing is returned to the step S1, so that this and subsequent steps are repeated. In contrast, if it is determined that the processing of the step S3 and subsequent steps has not been executed for this trigger information, the processing is forwarded to the step S3. In the step S3, the controller 68 reads out Command_code of the trigger information and determines which of the following the command indicated by this trigger information is: Pre_cache, Execute, Inject_event, Suspend, and Terminate. In a step S4, the controller 68 determines whether or not the determination result of the step S3 is Pre_cache. If it is determined that the determination result is Pre_cache, the processing is forwarded to a step S5. 
In the step S5, the controller 68 causes acquisition of the data broadcasting application specified by App_id of this trigger information. Specifically, if Broadcast_App_flag of this trigger information is 1, the data broadcasting application specified by App_id is acquired from a television broadcast signal and recorded in the recording part 71. If Downloaded_App_flag of this trigger information is 1, the data broadcasting application specified by App_id is acquired from the recording part 71 as a local storage. If Internet_App_flag of this trigger information is 1, the data broadcasting application specified by App_id is acquired from the server 42 via the Internet 50 and recorded in the cache memory 73. If two or more flags of Broadcast_App_flag, Downloaded_App_flag, and Internet_App_flag are 1, the data broadcasting application specified by App_id of this trigger information can be acquired depending on convenience for the receiving device 60. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. If it is determined in the step S4 that the determination result of the step S3 is not Pre_cache, the processing is forwarded to a step S6. In the step S6, the controller 68 determines whether or not the determination result of the step S3 is Execute. If it is determined that the determination result is Execute, the processing is forwarded to a step S7. In the step S7, the application engine 74 determines whether or not the data broadcasting application specified by App_id of this trigger information is dormant (in the suspended state) in accordance with control from the controller 68. Specifically, it is determined that the data broadcasting application is dormant if data indicating the suspended state of the data broadcasting application specified by App_id is saved in the save memory 75B. If it is determined in the step S7 that the data broadcasting application specified by App_id is not dormant, the processing is forwarded to a step S8. In the step S8, in accordance with control from the controller 68, the application engine 74 acquires the data broadcasting application specified by App_id if this data broadcasting application has not yet been acquired (does not exist in the recording part 71 or the cache memory 73). In a step S9, if a currently-running data broadcasting application exists, the application engine 74 stops it in accordance with control from the controller 68. In a step S10, the application engine 74 activates the data broadcasting application specified by App_id in accordance with control from the controller 68. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. If it is determined in the step S7 that the data broadcasting application specified by App_id is dormant (in the suspended state), the processing is forwarded to a step S11. In the step S11, the application engine 74 moves data in the save memory 75B to the work memory 75A and activates the data broadcasting application specified by App_id in accordance with control from the controller 68. Thereby, the dormant data broadcasting application specified by App_id is resumed from the suspended state. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. If it is determined in the step S6 that the determination result of the step S3 is not Execute, the processing is forwarded to a step S12. 
In the step S12, the controller 68 determines whether or not the determination result of the step S3 is Inject_event. If it is determined that the determination result is Inject_event, the processing is forwarded to a step S13. In the step S13, only when App_id of this trigger information corresponds with App_id of the running data broadcasting application, the controller 68 controls the application engine 74 to fire (execute) the event corresponding to Event_id of the trigger information in the running application. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. If it is determined in the step S12 that the determination result of the step S3 is not Inject_event, the processing is forwarded to a step S14. In the step S14, the controller 68 determines whether or not the determination result of the step S3 is Suspend. If it is determined that the determination result is Suspend, the processing is forwarded to a step S15. In the step S15, the application engine 74 saves, in the save memory 75B, data indicating the state of the currently-running data broadcasting application (i.e. data currently written to the work memory 75A, including information indicating the hierarchy of the displayed information if a hierarchical structure exists in the displayed information) in accordance with control from the controller 68. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. If it is determined in the step S14 that the determination result of the step S3 is not Suspend, the determination result of the step S3 is Terminate and thus the processing is forwarded to a step S16. In the step S16, if the data broadcasting application specified by App_id is running, the application engine 74 stops it in accordance with control from the controller 68. In a step S17, in accordance with control from the controller 68, the application engine 74 erases data relating to the data broadcasting application specified by App_id from the work memory 75A and the save memory 75B and erases the data broadcasting application from the recording part 71 or the cache memory 73. Thereafter, the processing is returned to the step S1, so that this and subsequent steps are repeated. This is the end of the description of the trigger information response processing. The above-described trigger information response processing enables activation of a data broadcasting application, event firing, and stop of the data broadcasting application in conjunction with AV content (show, CM, etc.) of television broadcasting. Furthermore, a data broadcasting application can be suspended in such a manner that the state immediately before suspend is held, and another data broadcasting application can be executed and stopped. Thereafter, the suspended data broadcasting application can be resumed from the suspended state. The above-described trigger information response processing enables operation of a data broadcasting application like that shown in The broadcasting device 41 transmits trigger information of the Pre_cache command instructing acquisition of the data broadcasting application corresponding to a show in conjunction with the progression of the show. Thereupon, the data broadcasting application is acquired in the receiving device 60 that has received the trigger information. 
Next, the broadcasting device 41 transmits trigger information of the Execute command for the data broadcasting application corresponding to the show in conjunction with the progression of the show. Thereupon, the data broadcasting application is launched in the receiving device 60 that has received the trigger information. By this launch, an icon indicating that displaying of the data broadcasting application is ready is so displayed as to be superimposed on the video of the show. If the user selects this icon, displaying by the data broadcasting application is superimposed on the video of the show on the screen. The broadcasting device 41 transmits trigger information of the Inject_event command in conjunction with the progression of the show. Thereupon, in the receiving device 60 that has received the trigger information, an event is fired in the running data broadcasting application (e.g. displaying is changed). Subsequently, at a predetermined timing, the broadcasting device 41 transmits trigger information of the Suspend command for the data broadcasting application. Thereupon, in the receiving device 60 that has received the trigger information, the running data broadcasting application is suspended (relevant data is retained in the save memory 75B). Thereafter, the broadcasting device 41 transmits trigger information of the Execute command for the data broadcasting application. Thereupon, in the receiving device 60 that has received the trigger information, the suspended data broadcasting application is resumed. Furthermore, the broadcasting device 41 transmits trigger information of the Terminate command in conjunction with the end of the show. Thereupon, in the receiving device 60 that has received the trigger information, the running data broadcasting application is stopped. As shown in The Stopped state refers to the state in which the data broadcasting application has not yet been acquired into the receiving device 60. The Ready state refers to the state in which the data broadcasting application has been acquired into the receiving device 60 and is not activated. The Active state refers to the state in which the data broadcasting application is activated and running. The Suspended state refers to the state in which the execution of the data broadcasting application is interrupted and information indicating the state of the interruption timing is retained in the save memory 75B. When the data broadcasting application has transitioned to the Stopped state (has not yet been acquired into the receiving device 60), transition to the Ready state occurs if trigger information of the Pre_cache command is received and the data broadcasting application is acquired in accordance with the Pre_cache command. When the data broadcasting application is in the Ready state, transition to the Active state occurs if trigger information of the Execute command is received and the data broadcasting application is activated in accordance with the Execute command. When the data broadcasting application has transitioned to the Stopped state (has not yet been acquired into the receiving device 60), transition to the Active state occurs if trigger information of the Execute command is received and the data broadcasting application is acquired and activated in accordance with the Execute command. 
When the data broadcasting application has transitioned to the Active state, transition to the Suspended state occurs if trigger information of the Suspend command is received and the running data broadcasting application is interrupted in accordance with the Suspend command. When the data broadcasting application has transitioned to the Suspended state, transition to the Active state occurs if trigger information of the Execute command is received and the interrupted data broadcasting application is resumed in accordance with the Execute command. When the data broadcasting application has transitioned to the Ready state, the Active state, or the Suspended state, transition to the Stopped state occurs if trigger information of the Terminate command is received and the data broadcasting application is stopped in accordance with the Terminate command. The transition to the Stopped state is not limited to transition based on trigger information of the Terminate command. The transition to the Stopped state occurs also when App_expire_date of trigger information passes, when another data broadcasting application is executed, and when switching of the reception channel is beyond App_life_scope. The state transition of plural data broadcasting applications that can be sequentially executed in the receiving device 60 will be described below. When a show is started, the data broadcasting applications A, B, and C are all in the Stopped state. When the Execute command for the data broadcasting application A is received, the data broadcasting application A is acquired and activated to become the Active state. At this time, in the work memory 75A, data relating to the data broadcasting application A is written. Next, when the Suspend command for the data broadcasting application A is received, the data relating to the data broadcasting application A, written to the work memory 75A, is moved to the save memory 75B and the data broadcasting application A becomes the Suspended state. Thereafter, when the Execute command for the data broadcasting application B is received, the data broadcasting application B is acquired and activated to become the Active state. At this time, in the work memory 75A, data relating to the data broadcasting application B is written. If the Pre_cache command for the data broadcasting application C is received when the data broadcasting application B is in the Active state, the data broadcasting application C is acquired to become the Ready state. Next, when the Execute command for the data broadcasting application A is received, because the data broadcasting application A is in the Suspended state, the data relating to the data broadcasting application A is moved from the save memory 75B to the work memory 75A and the data broadcasting application A is resumed. The data broadcasting application B is stopped. Thereafter, when the Execute command for the data broadcasting application C in the Ready state is received, the data broadcasting application C is read out and activated to become the Active state. In the work memory 75A, data relating to the data broadcasting application C is written. The data broadcasting application A is stopped. As described above, plural data broadcasting applications can be sequentially executed in the receiving device 60 and a suspended data broadcasting application can also be resumed from the suspended state. The second embodiment will be described below. 
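Before turning to the second embodiment, the trigger handling of the first embodiment described above can be summarized in a brief sketch. This is not the receiving device's actual processing: the helper names are hypothetical stand-ins for the application engine 74 and the application memory 75, and the handling of Trigger_validity follows one plausible reading of the server access distribution parameter N (acting with probability 1/N).

```python
import random

# States of a data broadcasting application in the first embodiment.
STOPPED, READY, ACTIVE, SUSPENDED = "Stopped", "Ready", "Active", "Suspended"

class TriggerHandler:
    def __init__(self) -> None:
        self.seen_trigger_ids = set()   # the same trigger is broadcast plural times
        self.states = {}                # App_id -> state

    def state(self, app_id: str) -> str:
        return self.states.get(app_id, STOPPED)

    def on_trigger(self, trig: dict) -> None:
        # Duplicate triggers (same Trigger_id) are processed only once (step S2).
        if trig["Trigger_id"] in self.seen_trigger_ids:
            return
        self.seen_trigger_ids.add(trig["Trigger_id"])

        # One plausible reading of Trigger_validity (server access distribution
        # parameter N): act with probability 1/N so that accesses spread out.
        n = trig.get("Trigger_validity", 1)
        if n > 1 and random.randrange(n) != 0:
            return

        app_id = trig["App_id"]
        cmd = trig["Command_code"]
        if cmd == "Pre_cache":
            self.acquire(trig)
            self.states[app_id] = READY
        elif cmd == "Execute":
            if self.state(app_id) == SUSPENDED:
                self.restore_from_save_memory(app_id)   # resume from suspension
            else:
                self.acquire(trig)                      # acquire if not yet acquired
            self.stop_other_running_application(app_id)
            self.states[app_id] = ACTIVE
        elif cmd == "Inject_event":
            if self.state(app_id) == ACTIVE:            # only for the running application
                self.fire_event(app_id, trig.get("Event_id"))
        elif cmd == "Suspend":
            if self.state(app_id) == ACTIVE:
                self.move_work_memory_to_save_memory(app_id)
                self.states[app_id] = SUSPENDED
        elif cmd == "Terminate":
            self.erase_application(app_id)
            self.states[app_id] = STOPPED

    # The helpers below stand in for the application engine and memory handling.
    def acquire(self, trig): ...
    def restore_from_save_memory(self, app_id): ...
    def stop_other_running_application(self, app_id): ...
    def fire_event(self, app_id, event_id): ...
    def move_work_memory_to_save_memory(self, app_id): ...
    def erase_application(self, app_id): ...
```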
As described above, five kinds of commands in the second embodiment are the commands of Register, Execute, Inject_event, Suspend, and Terminate. Specifically, this Register command is the same as the Pre_cache command in the first embodiment in that it instructs acquisition of a data broadcasting application, but is different in that it instructs also registration of the data broadcasting application. The registration of a data broadcasting application means that the priority (Persistent_priority) and retention time limit (Expire_date) of the acquired data broadcasting application are stored in association with this data broadcasting application. The stored data broadcasting application is managed in accordance with the priority and the retention time limit by the controller 68 (details will be described later). Trigger_id, Protocol_version, Command_code, Trigger_validity, and App_type are the same as those of the trigger information as the Pre_cache command shown in App_id in the second embodiment is identification information of the data broadcasting application acquired corresponding to this trigger information and also indicates the URL of the acquisition source thereof (in the present case, server 42). In other words, the acquisition source of the data broadcasting application is diverted also to the identification information thereof and set as App_id. Therefore, in the trigger information as the Register command, the item of App_url, which exists in the trigger information as the Pre_cache command shown in Persistent_priority indicates the priority of acquisition and retention of the corresponding data broadcasting application. In the present case, two bits are allocated to Persistent_priority and thus four levels of priority can be given. In acquisition and retention of the corresponding data broadcasting application, if the recording capacity to retain it is not left in the recording part 71, a data broadcasting application having priority lower than that of the corresponding data broadcasting application is erased from the recording part 71 to thereby ensure the recording capacity. If a data broadcasting application having priority lower than that of the corresponding data broadcasting application is not retained in the recording part 71, the corresponding data broadcasting application is not acquired. However, if possible, it may be acquired and temporarily retained in the cache memory 73. Expire_date indicates the retention time limit of the corresponding data broadcasting application retained in the recording part 71. If the retention time limit passes, the corresponding data broadcasting application is erased from the recording part 71. Items included in the trigger information as the Execute command are the same as those included in the trigger information as the Register command shown in Items included in the trigger information as the Inject_event command in the second embodiment are the same as those in the first embodiment, shown in Items included in the trigger information as the Suspend command in the second embodiment are the same as those in the first embodiment, shown in Items included in the trigger information as the Terminate command in the second embodiment are the same as those in the first embodiment, shown in The trigger response processing in the receiving device 60 in the second embodiment is substantially the same as that in the above-described first embodiment. A difference is as follows. 
The trigger response processing in the receiving device 60 in the second embodiment is substantially the same as that in the above-described first embodiment, with the following difference. In the trigger response processing in the first embodiment, a data broadcasting application is simply acquired and stored (step S5); in the second embodiment, the application registration management processing described below is executed instead.

In a step S31, the controller 68 determines whether or not the recording capacity to retain the data broadcasting application specified by the trigger information is left in the recording part 71. If it is determined that the recording capacity is left, the processing proceeds to a step S34. In contrast, if it is determined that the recording capacity is not left, the processing proceeds to a step S32.

In the step S32, the controller 68 erases from the recording part 71 a data broadcasting application whose priority is lower than that of the data broadcasting application specified by the trigger information, among the data broadcasting applications retained in the recording part 71 (i.e. the data broadcasting applications that have already been registered).

In a step S33, the controller 68 determines whether or not the recording capacity to retain the data broadcasting application specified by the trigger information could be ensured in the recording part 71. If it is determined that the recording capacity could be ensured, the processing proceeds to the step S34. In contrast, if it is determined that the recording capacity could not be ensured, the data broadcasting application is not acquired and the application registration management processing is ended.

In the step S34, the controller 68 causes the data broadcasting application to be acquired from the acquisition source indicated by App_id of the trigger information and retained in the recording part 71. In a step S35, the controller 68 registers the acquired and retained data broadcasting application, that is, manages the data broadcasting application in association with its priority and retention time limit. The application registration management processing is thus ended.

The registered data broadcasting application is erased from the recording part 71 when its retention time limit passes, whereby the registration of the data broadcasting application is deleted.
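The application registration management processing just described (steps S31 to S35) amounts to a capacity check, a priority-based eviction, and a registration step. The Python sketch below captures only that control flow; the storage interface (free_capacity, erase, store, registered) and the way the required capacity is known are assumptions made purely for illustration.

```python
def register_application(trigger, recording_part, required_size, acquire):
    """Sketch of steps S31 to S35 for a Register trigger.

    recording_part is assumed to expose free_capacity, erase(), store(), and a
    'registered' dict; acquire() is assumed to download the application from the
    URL given by App_id. How the required capacity is determined is not specified
    above, so it is passed in here as required_size.
    """
    # S31: is enough recording capacity left in the recording part 71?
    if recording_part.free_capacity < required_size:
        # S32: erase registered applications whose priority is lower than that of
        # the application specified by the trigger information.
        for app_id, record in list(recording_part.registered.items()):
            if record["priority"] < trigger.persistent_priority:
                recording_part.erase(app_id)
                if recording_part.free_capacity >= required_size:
                    break
        # S33: could the capacity be ensured?
        if recording_part.free_capacity < required_size:
            return False  # the application is not acquired; processing ends

    # S34: acquire the application from the source indicated by App_id and retain it.
    recording_part.store(trigger.app_id, acquire(trigger.app_id))

    # S35: register it, i.e. manage it together with its priority and retention limit.
    recording_part.registered[trigger.app_id] = {
        "priority": trigger.persistent_priority,
        "expire_date": trigger.expire_date,
    }
    return True
```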
Next, the state transition of the data broadcasting application in the second embodiment will be described. The Released state refers to the state in which the data broadcasting application has not yet been acquired into the receiving device 60. The Ready state refers to the state in which the data broadcasting application has been registered in the receiving device 60 but is not activated. The Active state refers to the state in which the data broadcasting application is activated and running. The Suspended state refers to the state in which the execution of the data broadcasting application is interrupted and information indicating the state at the timing of the interruption is retained in the save memory 75B.

When the data broadcasting application is in the Released state (i.e. has not yet been acquired into the receiving device 60), transition to the Ready state occurs if the data broadcasting application is acquired, retained, and registered in response to trigger information of the Register command, and transition to the Active state occurs if the data broadcasting application is acquired and registered and then activated in response to trigger information of the Execute command. When the data broadcasting application is in the Ready state, transition to the Active state occurs if the data broadcasting application is activated in response to trigger information of the Execute command.

When the data broadcasting application is in the Active state, transition to the Suspended state occurs if the running data broadcasting application is interrupted in response to trigger information of the Suspend command, and transition to the Ready state occurs if the running data broadcasting application is stopped in response to trigger information of the Terminate command. The transition to the Ready state also occurs when video switching goes beyond App_life_scope or when another data broadcasting application is activated.

When the data broadcasting application is in the Suspended state, transition to the Active state occurs if the interrupted data broadcasting application is resumed in response to trigger information of the Execute command, and transition to the Ready state occurs in response to trigger information of the Terminate command.

When the data broadcasting application is in the Ready state, the Active state, or the Suspended state and its retention time limit passes, the data broadcasting application is erased from the recording part 71 and its registration is deleted, so that transition to the Released state occurs.

In the first embodiment, once a data broadcasting application that has been activated is stopped, it is erased from the recording part 71. In contrast, in the second embodiment, a registered data broadcasting application is not erased until its retention time limit passes, even if it is activated and stopped. Therefore, a registered data broadcasting application can be operated so as to be activated and stopped plural times.
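The second-embodiment life cycle described above can likewise be sketched as a transition table. The following Python fragment is a hedged illustration only; the enum, the function names, and the expiry check are assumptions, and it models a single registered application.

```python
# Sketch of the second-embodiment life cycle: a registered application moves between
# Ready and Active any number of times and returns to Released only when Expire_date
# passes. Names are illustrative assumptions.
from datetime import datetime
from enum import Enum, auto


class LifeCycle(Enum):
    RELEASED = auto()   # not acquired, or registration deleted
    READY = auto()      # registered, not running
    ACTIVE = auto()     # running
    SUSPENDED = auto()  # interrupted, context kept in the save memory


def on_command(state, command):
    """Transitions driven by trigger commands for one registered application."""
    table = {
        (LifeCycle.RELEASED, "Register"): LifeCycle.READY,
        (LifeCycle.RELEASED, "Execute"): LifeCycle.ACTIVE,   # register, then activate
        (LifeCycle.READY, "Execute"): LifeCycle.ACTIVE,
        (LifeCycle.ACTIVE, "Suspend"): LifeCycle.SUSPENDED,
        (LifeCycle.ACTIVE, "Terminate"): LifeCycle.READY,    # stopped but still registered;
                                                             # also on video switching beyond
                                                             # App_life_scope or another app starting
        (LifeCycle.SUSPENDED, "Execute"): LifeCycle.ACTIVE,
        (LifeCycle.SUSPENDED, "Terminate"): LifeCycle.READY,
    }
    return table.get((state, command), state)


def sweep_expired(state, expire_date, now=None):
    """Erase the application and delete its registration once Expire_date passes."""
    now = now or datetime.now()
    if state is not LifeCycle.RELEASED and now > expire_date:
        return LifeCycle.RELEASED
    return state
```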
In the above, only the case in which a data broadcasting application is activated in response to trigger information as the Execute command has been described. However, it is also possible to activate a data broadcasting application without using trigger information as the Execute command. A specific example is as follows.

First, the following pieces of trigger information are broadcast: trigger information as the Execute command for the data broadcasting application app0, trigger information as the Register command for the data broadcasting application app1, and trigger information as the Register command for the data broadcasting application app2. In the receiving device 60 that has received them, the data broadcasting application app0 is acquired and registered to be activated. On the screen of the running data broadcasting application app0, icons corresponding to the data broadcasting applications app1 and app2 are displayed. Simultaneously, the data broadcasting applications app1 and app2 are acquired and registered.

If the user selects the icon corresponding to the data broadcasting application app1 displayed on the screen of the data broadcasting application app0, the running data broadcasting application app0 is stopped and the data broadcasting application app1 is activated. Thereafter, event firing, suspension, resumption, and stopping of the data broadcasting application app1 are carried out in response to the Inject_event command, the Suspend command, the Execute command, and the Terminate command, respectively, for the running data broadcasting application app1.

If the user selects the icon corresponding to the data broadcasting application app2 displayed on the screen of the data broadcasting application app0, the running data broadcasting application app0 is stopped and the data broadcasting application app2 is activated. Thereafter, event firing and stopping of the data broadcasting application app2 are carried out in response to the Inject_event command and the Terminate command, respectively, for the running data broadcasting application app2.

According to the above-described operation, plural data broadcasting applications can be activated in linkage with each other without using trigger information as the Execute command.

As described above, in both the first and second embodiments, processing relating to the data broadcasting application can be executed in conjunction with AV content such as shows and CMs. Also when a digital television show is retransmitted via e.g. a CATV network or a satellite communication network, a data broadcasting content service that operates in conjunction with the television show can be realized.

The above-described series of processing can be executed by hardware or by software. If the series of processing is executed by software, the program configuring the software is installed from a program recording medium into a computer incorporated in dedicated hardware or into e.g. a general-purpose personal computer that can execute various kinds of functions through installation of various kinds of programs.

In such a computer, a central processing unit (CPU) 101 and a random access memory (RAM) 103, among other components, are connected to one another by a bus 104. To the bus 104, an input/output interface 105 is further connected. To the input/output interface 105, the following units are connected: an input unit 106 composed of a keyboard, a mouse, a microphone, etc.; an output unit 107 composed of a display, a speaker, etc.; a storage unit 108 composed of a hard disk, a non-volatile memory, etc.; a communication unit 109 composed of a network interface, etc.; and a drive 110 that drives a removable medium 111 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.

In the computer having the above-described configuration, the CPU 101 loads a program stored in the storage unit 108 into the RAM 103 via the input/output interface 105 and the bus 104 and executes the program, for example, whereby the above-described series of processing is executed.

The program executed by the computer may be processed in a time-series manner along the order described in the present specification, or may be processed in parallel or at necessary timings such as when a call is made. The program may be processed by one computer, or may be subjected to distributed processing by plural computers. Furthermore, the program may be transferred to a distant computer and executed there.

In the present specification, the term "system" refers to the whole apparatus composed of plural devices.

Embodiments of the present disclosure are not limited to the above-described embodiments, and various changes can be made without departing from the gist of the present disclosure.

Disclosed herein is a transmitting device including an audio encoder and a transmitter.
The audio encoder is configured to generate an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in a receiving device is buried. The transmitter is configured to transmit the generated encoded audio stream to the receiving device.

1. A transmitting device comprising:
an audio encoder configured to generate an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in a receiving device is buried; and a transmitter configured to transmit the generated encoded audio stream to the receiving device.

2. The transmitting device according to a controller configured to supply metadata in which the trigger information is stored and size information for burying the metadata in a user data area of the encoded audio stream, and carry out control so that the metadata be buried in the user data area.

3. The transmitting device according to the audio encoder encodes an audio stream by an audio code number 3 system to generate the encoded audio stream, and the metadata is inserted in an area of auxiliary data in a frame structure of the audio code number 3 system.

4. The transmitting device according to the audio encoder encodes an audio stream by an advanced audio coding system to generate the encoded audio stream, and the metadata is inserted in an area of data stream element in a frame structure of the advanced audio coding system.

5. The transmitting device according to a video encoder configured to encode a video stream to generate an encoded video stream; and a multiplexer configured to multiplex the encoded audio stream and the encoded video stream to generate a multiplexed stream, wherein the transmitter transmits the generated multiplexed stream.

6. The transmitting device according to type information indicating a type of information is added to the metadata.

7. The transmitting device according to a plurality of kinds of information distinguished by an information identifier are included in the metadata.

8. A transmitting method of a transmitting device that transmits content, the method comprising:
generating an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in a receiving device is buried; and transmitting the generated encoded audio stream to the receiving device.

9. A program for controlling a transmitting device that transmits content, the program causing a computer of the transmitting device to:
generate an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in a receiving device is buried; and transmit the generated encoded audio stream to the receiving device.

10. A receiving device comprising:
a receiver configured to receive an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content is buried, the encoded audio stream being transmitted from a transmitting device; an audio decoder configured to decode the received encoded audio stream; and a controller configured to control processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream.

11. The receiving device according to the audio decoder acquires the trigger information stored in metadata from an area of auxiliary data in a frame structure of the encoded audio stream encoded by an audio code number 3 system.

12. The receiving device according to the audio decoder acquires the trigger information stored in metadata from an area of data stream element in a frame structure of the encoded audio stream encoded by an advanced audio coding system.

13. The receiving device according to a demultiplexer configured to demultiplex a received multiplexed stream; and a video decoder configured to decode an encoded video stream demultiplexed from the multiplexed stream, wherein the audio decoder decodes the encoded audio stream demultiplexed from the multiplexed stream.

14. A receiving method of a receiving device that receives content, the method comprising:
receiving an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content is buried, the encoded audio stream being transmitted from a transmitting device; decoding the received encoded audio stream; and controlling processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream.

15. A program for controlling a receiving device that receives content, the program causing a computer of the receiving device to:
receive an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content is buried, the encoded audio stream being transmitted from a transmitting device; decode the received encoded audio stream; and control processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream.

16. A broadcasting system comprising:
a transmitting device configured to transmit content; and a receiving device configured to receive transmitted content, wherein the transmitting device includes
an audio encoder that generates an encoded audio stream in which trigger information relating to control of an application program to be executed in conjunction with content in the receiving device is buried, and a transmitter that transmits the generated encoded audio stream to the receiving device, and the receiving device includes
a receiver that receives the encoded audio stream transmitted from the transmitting device, an audio decoder that decodes the received encoded audio stream, and a controller that controls processing relating to the application program executed in conjunction with the content in response to the trigger information obtained by decoding the encoded audio stream.
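As a rough illustration of the system outlined above, the sketch below shows trigger information, together with size information, being buried in a user data area attached to an encoded audio frame on the transmitting side and recovered on the receiving side for the controller. It is emphatically not the real AC-3 auxiliary-data or AAC data-stream-element syntax; the two-byte size field, the JSON payload, and the explicit audio_size parameter are assumptions chosen only to make the flow concrete.

```python
# Conceptual sketch: bury trigger metadata in a user data area appended to an encoded
# audio frame (transmit side) and extract it again (receive side). The byte layout is
# an illustrative assumption, not an actual audio codec bitstream format.
import json
import struct


def bury_trigger(encoded_audio_frame: bytes, trigger: dict) -> bytes:
    """Append a metadata container (2-byte size information + JSON payload) to the frame."""
    payload = json.dumps(trigger).encode("utf-8")
    user_data = struct.pack(">H", len(payload)) + payload
    return encoded_audio_frame + user_data


def extract_trigger(frame_with_user_data: bytes, audio_size: int):
    """Split the frame back into audio data and the buried trigger information.

    A real decoder would locate the user data area from the frame structure itself;
    passing audio_size explicitly is a simplification for this sketch.
    """
    audio = frame_with_user_data[:audio_size]
    size = struct.unpack_from(">H", frame_with_user_data, audio_size)[0]
    payload = frame_with_user_data[audio_size + 2:audio_size + 2 + size]
    return audio, json.loads(payload)


# Transmit side: bury an Execute command for an application in the audio stream.
frame = b"\x00" * 128  # stand-in for an encoded audio frame
sent = bury_trigger(frame, {"Command_code": "Execute", "App_id": "http://example.com/app0"})

# Receive side: the audio decoder recovers the trigger and passes it to the controller.
audio, trigger = extract_trigger(sent, audio_size=len(frame))
assert trigger["Command_code"] == "Execute"
```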











































