|MPEG-4 Editor and Server||
Thanks to increasing computing power, memory, and network bandwidth, users demand more and more multimedia applications, such as navigation guides, games, and product introductions. Multimedia applications have also become far more complex than in the past because of the many new multimedia technologies. A modern multimedia application requires integrating various heterogeneous media "objects" into a single display frame, and allowing those objects to cooperate with each other to present the final frame and to respond to user actions. These media objects span natural and synthetic content types and many different media file formats. The natural content types include image, frame-based video, and sample-based audio. The synthetic content types include 2-D shape, 3-D geometric object, text, MIDI, text-to-speech, and particle effects. Media content is not usually static; it changes according to either user actions or the environment (the passage of time, for example). Therefore, the media content must include interactive logic, determined by the multimedia display programs or by script code written by the content authors.
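The idea described above — heterogeneous media objects carrying their own interactive logic, composited into a single frame — can be sketched as follows. This is a minimal illustration, not MPEG-4 itself; the class names, the `render`/`on_click` methods, and the string-based "rendering" are all assumptions made for the example.

```python
from dataclasses import dataclass, field

# Illustrative sketch: a scene holds heterogeneous media "objects",
# each carrying its own interactive logic, and composites them per frame.

@dataclass
class MediaObject:
    name: str
    kind: str                      # e.g. "video", "image", "3d", "text"
    visible: bool = True

    def render(self, t):
        # A real object would decode/rasterize itself; here we only
        # describe what would appear at time t.
        return f"{self.kind}:{self.name}@t={t}"

    def on_click(self):
        # Interactive logic, as decided by the content author's script code.
        self.visible = not self.visible

@dataclass
class Scene:
    objects: list = field(default_factory=list)

    def compose(self, t):
        # Integrate all visible objects into a single display frame.
        return [obj.render(t) for obj in self.objects if obj.visible]

scene = Scene([MediaObject("logo", "image"), MediaObject("clip", "video")])
scene.objects[0].on_click()        # a user action hides the logo
print(scene.compose(1.0))          # → ['video:clip@t=1.0']
```

The point of the sketch is the separation of concerns: each object knows how to present itself and how to react, while the scene only integrates the visible objects into one frame.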
The MPEG committee therefore attempted to integrate the state-of-the-art multimedia technologies mentioned above into one standard, known as MPEG-4. MPEG-4 is the successor to the MPEG-1 and MPEG-2 video standards; the ISO standards committee completed it in 1998. Instead of relying on conventional frame-based video technology alone, it adopts an object-oriented approach that integrates existing multimedia technologies, such as 2-D/3-D graphics, animation, video codecs, multimedia streaming, interactivity, and a programmatic environment, into a single architecture.
MPEG-4, which incorporates most of the newest generation of multimedia technologies, is currently the most comprehensive multimedia standard. Owing to its enormous architecture and its many system components, which span many domains, it is hard to implement and so far has few real applications. For the standard to succeed and gain wide use, it needs not only a complete implementation of the whole system but also abundant MPEG-4 content.
|MPEG-4 Player||MPEG-4 defines an interactive virtual environment in which natural and synthetic hybrid media are mixed and transmitted over heterogeneous networks. Based on the great success of MPEG-1 and MPEG-2 in the A/V market, the ISO MPEG group continued with its next standard, MPEG-4, which was expected to be completed by Dec. 1998. Following the trend toward integrating computers, communication, consumer electronics, and content, MPEG-4 targets these issues: authoring and encoding of object-oriented scenes, encoding of natural and synthetic media, media streaming, and a uniform transmission interface in heterogeneous network environments. The goal of the MPEG-4 Scene Description Editor/Browser project is to implement an Editor that provides authors a visual editing environment in which to edit an MPEG-4 scene and then browse and play it immediately.||Finished||drliu|
|MPEG-1 Editor||Like the advent of desktop publishing in the 1980s, desktop digital video systems enable personal computer users to perform video production tasks previously handled only by large studios and production firms. An essential part of post-production is editing, and precision software tools can keep editing exact and video well-paced. Users can combine video clips to produce a coherent sequence of images and sound that communicates messages, emotions, or themes. Audio is another essential and often overlooked aspect of video editing. Audio effects, voice-overs, and musical soundtracks can be added during the post-production phase.
Visual special effects, usually generated in post-production, comprise several broad categories: compositing, animation, titles, motion control, transitions, and image processing. These are the basic functions of the MPEG editing tool.
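As a concrete illustration of one of these categories, a transition such as a linear crossfade can be expressed as a per-pixel blend of two frames. This is a simplified sketch (frames are represented as flat lists of intensities rather than decoded MPEG pictures, and the function name is invented for the example):

```python
# Illustrative sketch of one transition effect: a linear crossfade
# between two frames, each represented as a list of pixel intensities.

def crossfade(frame_a, frame_b, alpha):
    """Blend frame_a into frame_b; alpha=0 gives A, alpha=1 gives B."""
    return [(1 - alpha) * a + alpha * b for a, b in zip(frame_a, frame_b)]

a = [0.0, 100.0, 200.0]
b = [200.0, 100.0, 0.0]
print(crossfade(a, b, 0.5))   # → [100.0, 100.0, 100.0]
```

Sweeping `alpha` from 0 to 1 over successive output frames produces the familiar dissolve between two clips; other transitions (wipes, pushes) differ only in how the blend weight varies per pixel.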
|Shared Application Framework||A shared workspace is considered a significant feature of a synchronous concurrent engineering environment. An efficient way to provide a shared workspace is a shared window system (or shared application system), which lies between the applications being shared and the underlying window management system (WMS). With a shared window system, single-user applications can be transparently shared among multiple participants in a collaborative environment, without any modification. There are three paradigms for implementing a shared window system: the event-sharing paradigm, the UI-image-sharing paradigm, and the request-sharing paradigm. This paper discusses the characteristics and implementation differences among these three sharing paradigms, and then proposes a generic shared window system architecture that embodies all of them in a single architecture. Two crucial function groups, sharing activity management and I/O redirection/reproduction, are designed separately within this generic architecture. Several critical implementation problems, such as the latecomer problem, the spontaneous application sharing problem, and the cross-platform sharing problem, are also discussed. Finally, an application recorder/player system using the request-sharing paradigm is proposed to illustrate the use of application sharing techniques in a non-CSCW scenario.||Finished||ssyu|
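The event-sharing paradigm mentioned above can be sketched in a few lines: each participant runs an identical replica of the (unmodified) single-user application, and input events captured at one site are broadcast so that every replica processes the same event stream. The class names and the toy counter application below are assumptions made purely for illustration:

```python
# Illustrative sketch of the event-sharing paradigm: each participant runs
# an identical application replica; input events captured at one site are
# broadcast so every replica processes the same event stream.

class CounterApp:
    """Stand-in for a single-user application being shared."""
    def __init__(self):
        self.count = 0

    def handle_event(self, event):
        if event == "click":
            self.count += 1

class SharedWindowSession:
    def __init__(self, participants):
        self.replicas = [CounterApp() for _ in range(participants)]

    def inject(self, event):
        # The sharing layer sits between the application and the WMS,
        # so replication is transparent to the unmodified application.
        for app in self.replicas:
            app.handle_event(event)

session = SharedWindowSession(participants=3)
session.inject("click")
session.inject("click")
print([app.count for app in session.replicas])   # → [2, 2, 2]
```

This also hints at why the paradigms differ in cost: event sharing transmits tiny input events but requires every site to run the application, whereas UI-image sharing transmits rendered output and request sharing replays window-system requests.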