In this workshop we will look at how we can previsualize projectors and create a simulation of the real world using the ArcBallCamera component from the TouchDesigner palette in addition to a pre-built projector component.
Using the projector component participants will learn how to establish the perfect position for their projector. We will review the variables that can make a projection look good or bad. Factors including lux, millimeters per pixel, projection angle, and general best practices will be explained. Together, we will also discuss how simulation can help to avoid elements that can impact the quality of a projection.
Once this foundational material is covered we will look at how to set up a “sweet spot” camera to view a projection by using the Texture SOP. The instructor will describe the UV workflow and discuss the best approaches to use when working with UVs. From here we can make some visuals from our sweet spot camera which can then be re-projected onto our 3D set.
As a final step in the process we will use the CamSchnappr component from the palette to re-project our virtual 3D setup onto the real world object. Here we will see how the projection matrix can be determined via CamSchnappr and examine how a 3D scan workflow can assist in this type of scenario.
We’ll finish the workshop by comparing each other’s visuals and reviewing some simple SOP tricks to make classic projection mapping effects like trace lines.
Have you ever played the game The Incredible Machine (TIM)? Or maybe you’ve aligned some dominoes to create a mesmerizing chain reaction?
This session will highlight TouchDesigner's flexibility to create amazing in situ installations by combining physical ingredients (projectors, motors, smoke, LEDs, cardboard, tape) with digital ingredients (3D shapes, particle systems, animations, and text).
Participants will have the opportunity to prepare for the session in advance. The idea is to quickly brainstorm and start prototyping as a team by splitting up the different components of the final installation. The final result will be a friendly chain reaction. An exquisite corpse of poetic combinations of craft and code.
In this workshop Ronen Tanchum will present a variety of devices that can be connected to TouchDesigner and will outline different ways to use their incoming data. The workshop will begin with an overview of the mechanical ways to let TouchDesigner coordinate scenes with no human input and then will explore how to control scenes with simple keyframe animations. The workshop will also teach participants methods for automating animations with TouchDesigner.
Using a visual example in TouchDesigner the group will add human input from the various controllers. Participants will guide all the algorithms, nodes, and calculations in real time with various input methods that Tanchum has used in real productions.
For example, recording simple animations for objects using VR controllers and trackers can be used for mixed reality. The instructor will show how 3D scenes can be visualized and how to navigate virtual worlds by attaching lights to hand controllers in order to act and record performances. Finally, attendees will learn how to record camera movements and how to post-process them.
In this workshop Peter Sistrom will share his experience devising a flexible system for media sequencing and playback in many different and demanding scenarios over the last six years. Over time he has become familiar with the evolving set of features in TouchDesigner that have allowed this system to come to life and be repurposed in a range of projects that include: earth-shaking visual music shows, bombastic interactive performances, and completely rock-solid long-term media playback installations. All the while he’s dreamed of rounding out the foundation of these tools to the point of being able to show, share, discuss, and evolve them with the larger TouchDesigner community.
In the session Sistrom aims to describe the system he has developed and the reasons why he believes this infrastructure is useful in confronting different scenarios while striving to remain as light and flexible as possible.
Additional topics that will be explored include using tags for subjective description, using touch network hierarchies for tree browsing nested show assets, media management, techniques for timeline interfacing, and back-end methodologies that can be used in system building. These themes will be drawn together by an explanation of how these assets can be used in runtime systems.
Ultimately, this session hopes to inspire, add to, and continue the conversation around how the TouchDesigner community can start to share, develop, and enhance each other's foundations so we can all keep pushing the envelope!
This workshop will focus on a variety of aspects of character animation. We will begin with a discussion of the common paradigm for character animation in 3D animation software. Next, we will discuss how to import rigs from FBX, where to get them, and how to work with them. Participants will learn how to set up their own deforming rig with existing geometry, how to set up and tune a bone hierarchy, and how to use inverse kinematics and other types of goal-oriented bone control.
Attendees will also learn how to scan themselves using photogrammetry, how to rig and animate themselves, and where to get and how to implement ready-made animations. This section will expand on how to transfer animation made for one rig to another. Animation rigs can be used not just for character animation, but also for experimenting with geometry. We will review tips and tricks for working with rig animation data using TouchDesigner.
Finally, workshop participants will go over how to make a patch to control characters using Kinect CHOP and how to blend different animations. The workshop will end with a group discussion and the sharing of ideas related to using animated rigs in VJ setups and other ways to record character animation.
This workshop is for those who understand the basic operation of TouchDesigner and are interested in learning how to create audio-reactive video that can be controlled by UI. Through this workshop, participants will gain a base knowledge of VJ'ing techniques using TouchDesigner. The first half will cover simple elements such as TouchDesigner's standard grid, sphere, circle, and box. The second half will make use of 3D objects and Mixamo data to create more advanced visuals.
This workshop uses the new Bullet Solver in the latest experimental version of TouchDesigner to explore different ways of creating physics systems that both react to and produce sound through a TDAbleton connection. Through this session attendees will come to understand how the tight integration of sound and visuals can bring a synesthetic edge to real-time art.
In this workshop attendees will learn how to convert Bullet's collision triggers to velocity-sensitive MIDI data and how to map reverb and effects to the speed of the physics simulation to create dream-like bullet-time effects. Participants will learn how to add forces triggered by MIDI notes which push particles away depending on the velocity of a key press. A variety of other methods will be discussed that can be used to manipulate visuals with audio (and vice versa).
With these techniques participants will be able to turn the simplest visual works into projects that have double the impact as a result of their tight integration with sound.
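As an illustration of the kind of trigger-to-MIDI mapping described above, here is a minimal plain-Python sketch (the impulse range, note number, and function names are hypothetical; in TouchDesigner the impulses would come from the Bullet Solver's collision data):

```python
# Sketch: map physics collision impulses to velocity-sensitive MIDI notes.
# The impulse range (0-20) and the note number are illustrative assumptions.

def impulse_to_velocity(impulse, max_impulse=20.0):
    """Scale a collision impulse into the MIDI velocity range 1-127."""
    normalized = min(max(impulse / max_impulse, 0.0), 1.0)
    return max(1, round(normalized * 127))

def collisions_to_midi(impulses, note=60):
    """Turn a list of impulse magnitudes into (note, velocity) pairs."""
    return [(note, impulse_to_velocity(i)) for i in impulses]
```

A hard hit then produces a loud note and a glancing one a quiet note, which is what gives the resulting sound its velocity-sensitive feel.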
Have you ever wanted to make your own trippy old school visuals with a modular video synthesizer, but never had enough money to buy expensive hardware? In this workshop, Jonathan Thompson (@pointshader) will guide you through the basics of creating your own software modular - one where you’ll never have to pay for an extra module! The magic of old school video synthesis lies in the physical properties of a CRT monitor, so after an overview of the underlying physics, the workshop will open into a coding session that will teach you how to emulate a CRT monitor in GLSL, line by line. The rest of the workshop will focus on creating modules for our new system. You can expect to learn about creative uses of TOPs and feedback loops, along with good practices for system architecture using containers and custom parameters.
A Content Management System (CMS) is the optimal way to run a permanent installation. It enables full control over everything via a website, from anywhere.
In this workshop Javier Alvarez Bailen will provide a basic example that demonstrates how to manage a PHP based CMS and connect TouchDesigner with an online database. Participants will learn to customize their actions in order to change content and activate cameras and different devices.
An exciting feature of the CMS is a scheduler that lets you define when, where, and which content should play.
Building complex systems is challenging. Keeping track of network signal flows and inner-system dependencies is an art in its own right. In this workshop you will learn how to manage broad system functionality with multiple inputs and outputs. This workshop will cover topics that include network compartmentalization and layout, extensions, custom parameters, and Parent/OP Shortcuts. A basic knowledge of Python is required.
This workshop will walk participants through the process of creating real-time generative visuals using two types of systems: reaction-diffusion and cellular automata. Both methods yield a variety of complex and often non-repeating patterns.
Reaction-diffusion systems model the changing of one or more chemical substances. The examples that will be addressed in the workshop include beginner level TOP-based methods and a brief discussion of intermediate level techniques using GLSL. Cellular automata are simulations consisting of grids of cells which change their state based on simple rules. Participants, along with the instructor, will build 1D and 2D cellular automata systems from scratch using GLSL and will look at examples which incorporate decaying cells (or the “generations” variation) along with methods for utilizing multiple rule-sets in one system.
These patterns, while compelling on their own, can also be used as the input to other systems. This session will help participants understand how to use their motion to drive particle systems and manipulate geometry, in addition to exploring how to add interactive elements to these systems, such as audio-reactivity and motion control from a Kinect or Leap Motion. Above all, the workshop will expose participants to an iterative approach to creating generative visuals, exploring ways of reusing concepts and techniques in new ways.
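To give a flavor of the cellular automata logic described above, here is a minimal sketch of a 1D elementary automaton in plain Python (Rule 30 is used as an example rule-set; the workshop builds the equivalent update step in GLSL):

```python
# Sketch of a 1D elementary cellular automaton. Each cell's next state is
# looked up from the rule number using its left neighbor, itself, and its
# right neighbor as a 3-bit index. Edges wrap around.

def step(cells, rule=30):
    """Advance the row of 0/1 cells by one generation."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right
        out.append((rule >> index) & 1)
    return out

# Start from a single live cell and run a few generations.
row = [0] * 7
row[3] = 1
history = [row]
for _ in range(3):
    history.append(step(history[-1]))
```

In a GLSL version the same lookup happens per pixel, with the previous generation sampled from a feedback texture.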
This workshop is an exploration into projection warping with a focus on fulldome projection mapping. This session will benefit individuals with an interest in domes, projection, pixels, and immersive content playback. Time will be dedicated to understanding, designing, developing, and implementing usable fulldome mapping tools that will broaden participants’ horizons in terms of what can and cannot be done in the projection mapping world. Why pay for expensive software when you can do it yourself?
The first half of the day will explore the implementation of a 3D scene pre-visualization and the methods required to capture a dome correctly in order to allow us to project back into one. The second half of the day will focus on re-projecting the captured content and manipulating the projected image to create the perfect mapped dome. By the end of this workshop users will have hands-on experience with from-scratch projection mapping, pixel mapping, and manipulation tools, and will hopefully come away with one mapped dome!
The proliferation of easily accessible, networked technologies has impacted workflows for artists and curators all over the world. Contemporary communications strategies allow art industry stakeholders to quickly resolve challenges, to increase their mobility, and to forge strong bonds between countries, organizations, and each other.
These bonds trigger inspiration between arts professionals situated within a diversity of cultures while forming a basis for expansion and the creation of a new language in the digital arts. The development of this new language enables artists and other professionals in the field to share their unique ideas and values while making space for new and exciting collaboration opportunities.
The built-in Python libraries in TouchDesigner present a wide array of tools and mechanisms for creators to extend their workflow. What if you want to control devices like Philips Hue lights, create SVGs, or post to Instagram? External Python libraries can increase the reach of your work and expression, but it takes a bit of effort to seamlessly integrate them into your network.
In this session participants will first look at the process of adding an external Python library into the traditional Pythonic workflow. This initial exploration will be followed with a description of tools and best practices for simplifying that workflow, allowing participants to create transportable TOX files that ensure that Python dependencies travel seamlessly from project to project.
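One common approach to the dependency problem described above is to keep external libraries in a folder that travels with the project and add it to Python's search path at startup. A minimal sketch (the folder name "deps" is an arbitrary example, not a TouchDesigner convention):

```python
# Sketch: make a project-local dependency folder importable. In TouchDesigner
# this might run from a startup script, with the project folder supplied by
# the application; here a plain path stands in for it.

import os
import sys

def add_dependency_folder(project_folder, subfolder="deps"):
    """Prepend a project-local dependency folder to sys.path (once)."""
    dep_path = os.path.join(project_folder, subfolder)
    if dep_path not in sys.path:
        sys.path.insert(0, dep_path)
    return dep_path

path = add_dependency_folder("/tmp/my_project")
# After this, `import some_external_lib` will find packages copied into
# /tmp/my_project/deps, so the TOX and its dependencies move together.
```

Because the path is derived from the project's own location rather than hard-coded, the same TOX can be dropped into another project and still find its libraries.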
This workshop will fast-track game and visual effects artists interested in creating interactive scenes with TouchDesigner using 3D animations, textures, lights, and controls. Using TouchDesigner, workshop participants will integrate audio-driven animation, live controls, and many other types of inputs to drive their scenes.
Think of TouchDesigner as a real-time engine for 3D graphics and 2D compositing at the same time. It's also an application-building framework complete with interface building tools and scripting with Python. It is a node-based workflow, so it requires building both the scene and the controls needed to animate and interact with the scene. These controls can be automatically driven by an ever-increasing library of inputs and software development kits.
Derivative has recently incorporated a variety of game and VFX-familiar technologies, including a revamped physically-based rendering engine, SBSAR materials, GLSL shaders, and the ability to import models and animations as FBX, Alembic, or OBJ.
As a VFX or game artist, the most rewarding aspect of learning TouchDesigner is its ability to breathe new life into your existing skill set. Workshop participants can expect to experiment with and create interactive experiences they never thought possible. Those thinking of incorporating TouchDesigner into an existing VFX pipeline, to leverage the GPU for automated compositing and/or rendering tasks, will also benefit from this session.
By the end attendees will have an understanding of how to import assets and how to set up scenes, controls, and a basic render engine. The workshop will close with a discussion of best practices for scene optimization and performance.
In this workshop participants will acquire a basic understanding of how GLSL shaders operate within TouchDesigner. Additional information around their syntax and common use will be reviewed. This will be achieved through the creation of two real-life projects (an analog style synthesizer and a 2D pattern generator). The instructor will be present to guide students throughout the process. The main concepts of the workshop will be discussed and demonstrated at the beginning of the class and the rest will present themselves as the class unfolds.
This workshop is intended for programmers that want to deepen their understanding of computer graphics, for designers that want to learn how things are made, and for anyone curious enough to see the way pixels are drawn on a screen. A basic understanding of programming is required. Some experience with computer graphics will be useful.
This workshop focuses on deepening participants' understanding of GLSL use cases within TouchDesigner. Both GPU-based particle systems and basic raymarching techniques will be covered; these are intermediate uses of shader programming and serve to explain how to get the most out of your computer's graphics processor. Participants will create two real-life projects to demonstrate the techniques and theories explained. This workshop is intended for programmers with some knowledge of the graphics render pipeline.
In this workshop, we will review advanced techniques using GLSL shaders in TouchDesigner. Through the use of modern GPUs participants will learn how to gain flexibility and increase customization.
In the first section of this workshop, the instructors will explain how compute shaders differ from other types of shaders. Next, they will create a number of practical examples that make use of shaders in TouchDesigner, demonstrating how to achieve better control over particle systems and their optimization. Participants will also learn how to create more efficient 3D textures and how to manipulate them.
In the second part of the workshop, the instructors will explain how to use raymarching as a tool to visualize 3D textures and as an alternative to polygon rendering to create and render organic shapes. A discussion on mixing the traditional TouchDesigner rendering pipeline with raymarching and accessing the TouchDesigner cameras and lights in the raymarched scene will close the workshop.
This workshop aims to demonstrate a new way of manipulating SOPs in TOP space, while preserving optimized performance, by leveraging the GPU through geometry shaders to convert an SOP into, or from, a TOP. The geometry shader will be presented and explained first through a series of simple and practical examples, to illustrate its unique capabilities in comparison to vertex and pixel shaders, and how they may be used to accomplish our goal.
Using this knowledge, we will build a GPU-accelerated tox responsible for converting an animated SOP into a series of TOPs, storing vertex attributes and connectivity that vary over time. Then another component will use those very TOPs to reconstruct the geometry with another geometry shader.
Finally, we will explore how to generate, modify, and combine different geometries in TOP space, leveraging the power of pixel shaders to do so. Some tools will be presented that show how to apply this technique to point cloud data or Kinect recordings.
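The storage idea behind this SOP-to-TOP round trip can be sketched in plain Python (nested lists stand in for the 32-bit float texture, and the function names are illustrative, not part of the tox):

```python
# Sketch: vertex positions packed into the rows of a float "texture"
# (one row per frame, one RGBA pixel per vertex), then read back.

def pack_frames(frames):
    """frames: list of frames, each a list of (x, y, z) vertex positions.
    Returns rows of RGBA pixels; the alpha channel is unused here."""
    return [[(x, y, z, 1.0) for (x, y, z) in frame] for frame in frames]

def unpack_frame(texture, frame_index):
    """Reconstruct the (x, y, z) positions stored in one row."""
    return [(r, g, b) for (r, g, b, a) in texture[frame_index]]

frames = [[(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)]]
texture = pack_frames(frames)
```

On the GPU the unpack step happens in a shader that samples one pixel per vertex, which is what lets the reconstruction stay entirely in TOP space.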
Widgets are a new and exciting way to create highly customizable interfaces. Even though widgets can effectively reduce a creator’s workload it can still be hard to build big and complex interfaces by hand. In this workshop we will take a deeper look at ways of automating interface creation for the users themselves and for quick prototyping.
This workshop will outline widgets and binding and will address network organization using custom comps, external tox files, and OPShortcuts. Additionally, we will review network analysis and comps, learn how to create new functionality with the use of extensions, outline how to use drag and drop, take a deeper look into complex Python scripting with class-extension and error-handling, and master creating and recalling presets.
When Walter Gropius founded the legendary design school Bauhaus one hundred years ago in the German town of Weimar, the world had been turned upside down. Industrialization had severely changed people's everyday lives and WWI had shown the brutal potential of these new technologies. At this time, design and architecture had been struggling to find a new language that was appropriate for these radically changing means of production. In this fragile situation the architect Walter Gropius managed to gather some of the finest artists and designers of his era and founded a new school where art and design were taught in a completely new way.
How could the Bauhaus achieve this? With the Bauhaus Manifesto, Gropius took inspiration from the idea of the Bauhütte, the mystical design studio of the ancient cathedrals, where all crafts were united to make one holistic vision come to life. At the same time, students of the Bauhaus were encouraged to take matters into their own hands. In different workshops they were free to experiment with craft techniques and as a result produced a range of experimental works. This approach allowed students and masters to quickly overcome traditional thinking about form, color, and space and extract from this creative exploration what would come to be known as the International Style, the first global design language that continues to shape our reality today.
One hundred years later, we find ourselves in a similar situation. New technologies demand new design solutions, yet specialization will not necessarily lead us down the right path. Much like the Bauhaus dreamed of art, life, and craft as a holistic gesamtkunstwerk (total work of art), working with TouchDesigner allows us to integrate all art disciplines into one experience that permeates both the real and the virtual.
In an homage to the early Bauhaus, instructor Stefan Kraus will guide workshop participants through five exercises that speculate on how five famous Bauhaus masters would have used TouchDesigner to implement their ideas and teachings.
In this workshop participants will learn how to create kickass generative projects with an emphasis on fun. The pros and cons of using TouchDesigner for creativity will be discussed, as well as the best strategies for going from sketches to results in as few steps as possible. Topics will include how to get inspiration from recent works; how to avoid getting stuck while trying to reuse the right year-old .toe file; useful tricks for productivity; how to avoid boring stuff; and how to get intermediate results with ease. Finally there will be a discussion about what the Zerro tool is and what it solves in TouchDesigner.
In this workshop, participants will explore a number of creative methods, interfaces, and sensors for controlling live elements with TouchDesigner. Devices and features including MIDI controllers, Leap Motion, Kinect, Mi.Mu Gloves, as well as OSC and MIDI messaging protocols will be explored. Instructor, Synthestruct (aka. Ginger Leigh) will explain how TouchDesigner can employ these messaging protocols to communicate directly with Ableton Live in order to create visuals that respond to audio in Ableton, in real time. Participants will also receive tips for conceptualizing their ideas and optimizing their live control setup to reduce the risk of problems during a live performance.
After taking an in-depth look at several creative examples, workshop attendees will work together in pairs or small groups to create their own project using one or more of the control methods covered in the workshop.
This workshop is an introduction to the world of C++ CHOPs programmed specifically for TouchDesigner. At the beginning of this workshop participants will gain an understanding of the custom C++ CHOP example source code while reviewing its main accessible methods.
Afterwards the group will review the design of a very simple module that will allow us to send data to TouchDesigner through the CHOP we will have created. At this stage participants will tackle the heart of a common problem together: the integration of a special sensor for TouchDesigner.
The presenters will demonstrate how to pair the Visual Studio debugger with TouchDesigner, how to use it, how to communicate with the sensor, and then how to send the data into TouchDesigner without causing any threading problems. By the end of this session attendees will have a global view of the design of a CHOP module in C++.
Unfortunately, this workshop is mainly intended for participants using Windows (because of the emphasis on Visual Studio and the instructor’s experience with this operating system). An advanced Mac OS X user with a solid knowledge of C++ could also join the workshop.
This workshop introduces participants to strategies that can be applied to TouchDesigner when authoring responsive systems and dynamic content. Participants will learn how to leverage Python members and expressions available to operators such as me.digits, n.numSamples, or me.inputPoint.index. Additional workflow tips will be provided in order to help developers better organize their programs and ensure the optimal use of inputs and properties that change in real time. The flexibility and modularity benefits of a dynamic approach will be outlined and contrasted against static content where manual changes are required to implement new conditions.
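The idea behind an expression like me.digits can be reproduced in plain Python: in TouchDesigner it returns the trailing number in an operator's name (e.g. a node named movie7 yields 7), which lets each copy of a network configure itself from its own name. A minimal sketch (the movie_path helper and its folder layout are hypothetical examples):

```python
# Sketch: extract the trailing digits from an operator-style name, the way
# me.digits does, and use them to parameterize content per copy.

import re

def trailing_digits(name):
    """Return the trailing integer in a name, or None if there is none."""
    match = re.search(r"(\d+)$", name)
    return int(match.group(1)) if match else None

def movie_path(op_name, folder="content"):
    """Example of a parameter that adapts to the operator's name,
    e.g. one movie file per duplicated player component."""
    index = trailing_digits(op_name) or 0
    return f"{folder}/clip_{index}.mp4"
```

Because the expression reads the name at evaluation time, duplicating the network is enough to point each copy at new content, with no manual edits.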
Houdini is the grand-daddy of TouchDesigner, and it's a great choice for a daily-driver DCC if you are focusing on real-time graphics rendering. This lecture will go over what a pipeline is, what they're made of, how I design them, and how Houdini and TouchDesigner can work together to put the pixels on the screen as fast as possible. The context for the projects I'll be showing will be asset creation for Extended Reality (xR), but the pipeline-oriented design methods I use for my xR work can serve anyone that needs to prioritize high levels of art direction at the same time they prioritize speed. We'll also be going over the new PDG/TOPs features inside Houdini's 17.5 release, and some tactics you can use for rolling them into your current studio pipeline.
This session will feature a conversation about higher level principles and workflows. Much of the discussion will be based on challenges and solutions that were encountered throughout the development of GeoPix 1.0. Together the group will dive into a simpler isolated .TOE. Attendees can follow along to explore and see directly some of the issues and solutions for a variety of mock scenarios that can arise when programming large-scale projects.
Controlling laser show projectors via the ILDA protocol is a challenging but rewarding approach. Tim Greiser, co-creator of Laser Juice, will break down what you need to know to get started using TouchDesigner with Ether Dream or Helios DACs in order to produce beams, abstracts, or interactive tools with laser light. Greiser will provide theory, terminology, protocol and hardware details, and will provide example code for working with SOPs, CHOPs, or TOPs. The pros and cons of each domain will be discussed. Participants will learn about topics like scan rate, blanking, color signal techniques, and vector scanning.
This will be a hands-on session where small groups will form to build projects on virtual scopes that will eventually be displayed with lasers. At the end of the workshop the group will engage in a discussion about the results and challenges of their experimentation.
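The point-stream idea behind concepts like blanking and scan rate can be sketched in plain Python: a laser frame is a list of coordinate-plus-color samples played back at a fixed rate, and travel moves are "blanked" (color zeroed) so no beam is drawn. The function and defaults below are illustrative, not part of any DAC's API:

```python
# Sketch: convert a polyline into laser points. Blanked points are inserted
# at the first vertex so the scanner can settle there before the beam turns
# on; real DAC protocol details (Ether Dream, Helios) are omitted.

def shape_to_points(shape, color=(255, 255, 255), blank_lead=3):
    """Return (x, y, r, g, b) samples for one polyline, with a blanked lead-in."""
    points = []
    x0, y0 = shape[0]
    points.extend([(x0, y0, 0, 0, 0)] * blank_lead)  # blanked travel/settle
    r, g, b = color
    points.extend([(x, y, r, g, b) for (x, y) in shape])
    return points

frame = shape_to_points([(0, 0), (100, 0), (100, 100)])
```

Too few blanked points and the beam smears into the travel path; too many and the effective frame rate drops, which is exactly the scan-rate trade-off the workshop covers.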
This workshop will revolve around three projects that were developed collaboratively by Rémi Lapierre, Yan Breuleux, and Alain Thibeault. The technical aspects of interfacing TouchDesigner with different devices for live performances will be presented.
This workshop will address the following projects:
Les planètes, a live concert performed in a dome with piano accompaniment. This piece involves interfacing TouchDesigner with a concert piano (Disklavier), drawing live in the dome using the HTC VIVE, and simulating the dome in VR with the HTC VIVE for testing.
Enigma, a live performance that involves nine screens. In relation to this project the presenters will discuss the process of interfacing controls with TouchOSC, interfacing with Live, controlling the performance with MIDI and OSC, and simulating the experience in VR.
StudioVR, a virtual cinema studio for creating camera animation for 3D movies using TouchDesigner and the HTC VIVE.
This workshop will close with instruction on importing, playing, and controlling assets from other 3D software via FBX in TouchDesigner, in addition to mapping controls on HTC VIVE controllers for timeline control, teleportation, playback recording, and exporting camera animation to other software (e.g. Maya).
Since the early 2000s performers have been incorporating livecoding into their practices. In 2011, events called algoraves emerged, combining livecoding with dance and algorithmic music. Generally, livecoding involves improvising sound and video by programming on the fly. This practice has many practical applications in all sorts of creative coding work, even outside of the field of performance.
In this workshop, we’ll work with code that modifies TouchDesigner networks in real time through Visual Studio Code. Starting with a basic 3D render, we’ll build up generator and effect functions, talk about code structure, and finish with a larger interactive network. Participants will leave the workshop with the skills to understand TouchDesigner networks as functions and variables and will be comfortable with experimenting with this alternate way of interacting with TouchDesigner.
After the workshop, participants will be encouraged to seek out their local livecode groups and incorporate livecoding into their practice as a way to prototype larger projects that are still in development.
Authoring large projects in TouchDesigner is an exciting task. It's often easy to get your first project off the ground, but as they complete more projects, creators can start to find repetition in their work. Users of TouchDesigner can lose precious time setting up the same things over and over again. Hopefully creators are learning lessons along the way about efficient methods and approaches they prefer, but how do you improve your workflow once you have a handle on the basics? What are some essential principles to hold onto? What are some best practices to keep in mind? How might we avoid creating the same pieces over and over again and make more time for the fun stuff - the art making and the creative work?
In this session instructors Matthew Ragan and Zoe Sandoval will describe TouchDesigner best practices and will leave participants with principles for efficient and reusable structures to use in their future projects.
This workshop will focus on multi-GPU system design, implementation, and execution with an emphasis on stability, performance, and simplicity. Participants will examine the inner workings of a custom multi-node playlist component (Fusion Player) that plays back playlists of cues (either pre-rendered content files or real-time components) across multiple nodes (i.e. GPUs and/or servers). In the workshop, methods for communication, synchronization, and configuration of each node in the network will be discussed, along with various techniques to keep performance high while maintaining high functionality and high render quality.
TD-centric advanced Python and GLSL techniques used to maintain simplicity and speed will also be covered.
At the end of the workshop attendees should have a solid understanding of the requirements of implementing and executing a large scale system as well as an understanding of the data model flow of the Fusion Player component itself. They will also be able to keep the Fusion Player component for future use and reference.
The aim of this workshop is to familiarize participants with the use of photogrammetry and to show how this technique can be applied within TouchDesigner using different methods and techniques including DATs, CHOPs, SOPs, Instancing, and GLSL materials.
This workshop will walk attendees through the steps of creating a 3D scan and will provide hands-on experience using a photo camera and photogrammetry technologies to scan physical objects and transform them into 3D models from a set of ordinary images.
While using RealityCapture software (a license will be provided), participants will be able to start a personalized project and later develop and render it in TouchDesigner. Afterwards we will use post-FX to make the 3D scans look beautiful.
This workshop is designed for beginners/intermediate users and enthusiasts who are interested in 3D scans from photos and are eager to apply this technique to their projects. Sample files will be provided.
This half-day workshop is meant for users who have a general understanding of the TD interface and its operators. Building on the concepts and techniques presented in the 2018 Berlin Summit Pixel Mapping Workshop, we will construct a flexible system for mapping and controlling LED installations. A variety of dynamic elements for generating content will be made accessible from an intuitive UI, allowing for a fluid workflow. These elements will touch on some generally useful practices such as 3D rendering, customized components, preset storage and recall, MIDI mapping, and GLSL. We will have some LED fixtures present for demonstration.
In a world where new design tools are constantly changing and evolving, TouchDesigner creates space for artists and designers not only to create new works but also to build their own tools. TD enables creators to adopt a generative frame of mind that can lead to different methods and original outcomes.
In this workshop Ben Benhorin will teach participants how to find their own original style and unique tone-of-voice through generative design while also considering the broader contexts of media and art. This workshop will offer methods to find focus and style in a world of endless possibilities. This session aims to simplify the complexity of an open platform like TouchDesigner. Its “open canvas” structure offers total creative freedom that can also be confusing. Participants will learn how to identify and use their own design patterns and favourite styles, how to refine techniques and parameters, and how to formulate them into a more structured system of reusable effects and components.
Borrowing from the world of guitar pedal effects, the presenter will create new effect racks that expose the core parameters driving the visuals through a UI and hide what is not needed. By the end of this workshop participants will be able to create a set of tools that reflect their unique voices. They will learn to create visual worlds composed of simple rules that allow both complexity and creative freedom.
This workshop is based on the presenter’s long and intensive experience trying to improve the complexity and quality of the real-time graphics he creates in TouchDesigner. This session is oriented towards intermediate TouchDesigner users who have an understanding of Surface Operators and a handle on the basics of GLSL coding.
The instructor will cover the OpenGL rendering pipeline, how to create geometry on the GPU, how to proceduralize code via custom data structures and functions, as well as the process for building a shader builder and transferring geometry back to the CPU.
This workshop will begin with an introduction that addresses the various sensing technologies and input devices currently available to artists and designers. Then we will focus specifically on I-CubeX sensor technology.
In this hands-on workshop participants will learn how to add sensors to a TouchDesigner project using an I-CubeX kit (provided by Infusion Systems). Using their own kits, participants can apply instructions to make their Arduino or Raspberry Pi compatible with I-CubeX technology. Attendees will learn the capabilities of I-CubeX, including what the sensors can capture and how they can be applied, how to set up a number of sensors and prepare them for use with TouchDesigner, and how to route and apply sensor messages in TouchDesigner.
A number of examples that show the application of sensors in a TouchDesigner project will be available for participants to try out. Participants can also work on specific sensing needs for their own project. All I-CubeX products will be available to participants to try out during the workshop.
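Routing sensor messages usually means conditioning raw readings before they drive visuals: scaling them to a normalized range and smoothing out jitter. The sketch below is a minimal, hypothetical illustration of both steps in plain Python (the 10-bit 0-1023 input range and the filter coefficient are assumptions, not I-CubeX specifics).

```python
def scale(raw, in_min=0, in_max=1023, out_min=0.0, out_max=1.0):
    """Map a raw sensor reading (e.g. an assumed 10-bit value) to a 0-1 range."""
    raw = max(in_min, min(in_max, raw))  # clamp out-of-range readings
    return out_min + (raw - in_min) * (out_max - out_min) / (in_max - in_min)

class Smoother:
    """One-pole (exponential) low-pass filter to tame sensor jitter."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # 0..1; lower = smoother but laggier response
        self.value = None
    def update(self, sample):
        if self.value is None:
            self.value = sample          # first sample seeds the filter
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```

In TouchDesigner the same conditioning is often done with Math and Filter CHOPs; the code form is useful when sensor data arrives through a script or DAT callback.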
Data visualization is a powerful tool for visual communication that balances design and technique and is made all the more relevant by today’s data-driven world. Visualizing data with maps has a long history, from the public health breakthrough of John Snow’s cholera map to the call for social justice represented by Ekene Ijeoma’s Refugee Project.
TouchDesigner is a powerful tool for 3D and spatialized data. In this workshop participants will learn how to represent data associated with geographic information and how to make that visualization interactive using TD. Participants will learn best practices in data visualization, handy techniques like isometric visualization, heat maps, and using text in 3D. We will explore how to use Python to parse data, how to create UI and interactive camera animations, how to use polar coordinates for representing data on a globe, and more.
This workshop will start with micro-examples of common data visualization techniques that will demonstrate how to translate and encode data into color, shape, size, arrangement, and how to transform these qualities over time. Next, the session will focus on representing a real data set on a globe through instancing, using materials and instance parameters (in order to vary the representations of data). Then the group will address generating and locating text with a 3D model. Finally, the instructor will describe how to create an interface for interactive navigation and browsable data layers.
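Placing data on a globe comes down to the polar-coordinate conversion mentioned above: turning latitude and longitude into Cartesian positions that can feed instancing. A minimal sketch, assuming a Y-up scene with longitude 0 on the +Z axis (the axis convention is an assumption, not a TouchDesigner requirement):

```python
import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    """Convert latitude/longitude in degrees to a point on a sphere.

    Assumed convention: +Y is the north pole and longitude 0 lies on +Z.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.sin(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.cos(lon)
    return x, y, z
```

Feeding such XYZ triples (plus per-point color or scale columns) into instance parameters is one common way to map a data set onto a globe.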
This workshop will focus on system design for large scale installations. Notably, the session will review the basics of setting up a standardized project environment in order to manage multi-machine installations. This includes config-based dynamic loading/protocols for development and production environments, file structure/directory conventions, code externalization, and git versioning.
Further, the workshop will cover best practices for handling data. Key concepts will include an extension-based philosophy for structuring your network and code for maximum efficiency, using inheritance and component compartmentalization to isolate functionality across your system, reusable parsing and data routing methods, decoupling functionality from interfaces, and general component and module organization.
Other key points that will be addressed in this session include methods for data visibility – such as using StorageManager and persistent data, extension properties, and setters/getters for efficient code handling as well as using operator dependencies and dependable objects for reliable network data inputs.
Lastly, the instructors will discuss essential optimization knowledge to keep large scale installations running at peak performance. These include items such as start-up procedures, data caching, raster re-mapping, and additional tips and tricks learned from real-world integrations.
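One core idea above is config-based loading: the same project boots differently depending on whether it is running on a dev laptop or a production node. The sketch below is a hypothetical illustration of that pattern (the config keys and the `INSTALL_ENV` variable name are invented for the example, not the instructors' actual convention).

```python
import os

# Hypothetical per-environment configs; in practice these would live in
# external files under the project's directory conventions.
DEV = {"resolution": [1280, 720], "fullscreen": False, "nodes": ["local"]}
PROD = {"resolution": [3840, 2160], "fullscreen": True, "nodes": ["node01", "node02"]}

def load_config(environment=None):
    """Select a config by environment name, defaulting to an env variable."""
    environment = environment or os.environ.get("INSTALL_ENV", "dev")
    return PROD if environment == "prod" else DEV
```

Keeping this selection in one place means a machine's role is set by a single environment variable rather than by edits scattered through the network.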
Machine Learning is taking over as a new means for creative and artistic practice. Many platforms already exist that make it relatively easy for programmers to use and integrate in projects. Nevertheless, as mere users of these platforms, we are often left with a superficial understanding of how the algorithms truly work and what the possibilities and potentials of a given model are. This workshop primarily aims to fill in some of these conceptual gaps and provide a gentle introduction to how machine learning works and how it can be implemented from scratch.
After this foundation is established, workshop presenters Darien Brito and Tim Gerritsen will demonstrate TDNeuron, an open-source effort they want to share with the community. It offers enough tools to dive deeply into the algorithms of machine learning or to use them from a high level without worrying too much about the details. The instructors will show how these algorithms can be implemented and used natively in TouchDesigner.
TDNeuron will soon be open-sourced online. Brito and Gerritsen want to invite all enthusiasts of the project to contribute their ideas, remarks, comments, and code.
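To give a flavor of "machine learning from scratch," here is a minimal sketch of the core loop such workshops typically build up from: a single linear neuron trained by gradient descent on a mean-squared-error loss. This is a generic illustration, not TDNeuron code.

```python
# A single neuron y = w*x + b, trained to learn the mapping y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(-3, 4)]
w, b, lr = 0.0, 0.0, 0.05

for epoch in range(500):
    grad_w = grad_b = 0.0
    for x, target in data:
        err = (w * x + b) - target       # prediction error
        grad_w += 2 * err * x / len(data)  # d(MSE)/dw, averaged over samples
        grad_b += 2 * err / len(data)      # d(MSE)/db
    w -= lr * grad_w                     # step against the gradient
    b -= lr * grad_b
# w and b converge close to the true values 2 and 1
```

The same forward pass / loss / gradient / update cycle underlies larger networks; TDNeuron's contribution is implementing it with native TouchDesigner operators and GLSL.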
Since its invention in 1925 by Richard Gurley Drew (3M Scotch), adhesive tape has become a medium of choice for a large number of artists, from a replacement for paint and colored glass to its own contemporary form combining tape art with projection, a.k.a. Tape Mapping. Tape art already sticks to the walls of several major international art galleries and museums. This workshop will be an introduction to the medium, its history, its contemporary applications, and finally Guillaume Bourassa's own artistic approach to the sticky ribbon.
Participants will explore the Kantan Mapper .tox already included in the TD99 palette, which allows, among other things, the creation of "Tape Mapping"-style installations. The project will then be taken one step further: controlling the content with Python and an Android tablet running TouchOSC will make it interactive. The workshop will end with participants creating a collective tape work based on a pattern created by artist ThisIsNotDesign [TiND].
This workshop is for beginners and will provide attendees with an introduction to workflow, concepts, and techniques for using TouchDesigner. Participants will walk away with the skills they need to start creating interactive and generative projects.
Expect an introduction to the user interface and an overview of basic work concepts. You will get to know the different operator families, generative and instancing techniques, how to build visual effects modules, as well as how to output to projectors and multiple displays. The workshop will close with the presentation of examples that demonstrate practical explorations of TouchDesigner and performance strategies that will help you get started.
During this workshop participants will learn key techniques for putting together a complete electronic music and audio-synthesis workflow inside TouchDesigner through the use and study of several pre-built components. These will include MIDI sequencers, a polyphonic synthesizer, a drum machine, and a sampler, as well as various audio effects. Once participants are familiar with some of these basics, they will be given time to coordinate the development of an audiovisual jam/performance using these components and added functionalities.
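Two building blocks that any such synthesis workflow rests on are the MIDI-note-to-frequency mapping and a raw oscillator. A minimal sketch in plain Python (in TouchDesigner this is typically done with MIDI In and Audio Oscillator CHOPs; the code form just makes the math explicit):

```python
import math

def midi_to_hz(note, a4=440.0):
    """Equal-temperament pitch: MIDI note 69 = A4 = 440 Hz."""
    return a4 * 2 ** ((note - 69) / 12)

def sine_samples(freq, seconds=0.01, sample_rate=44100):
    """Generate raw sine-wave samples, the simplest synth voice."""
    n = int(seconds * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]
```

For example, `midi_to_hz(60)` gives middle C at roughly 261.63 Hz, and an octave up doubles the frequency.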
Since the introduction of TouchDesigner for macOS, many video designers working in the field of stage performance have been eager to integrate the platform into their practice. As TouchDesigner becomes more widely used by individuals working in stage performance, they are searching for solutions to the industry-specific issues that arise.
In this workshop participants will learn how to adapt TouchDesigner for the production of dance and theatrical performances. We will examine how to collaborate with performance directors and stage and sound designers. Based on the instructor’s experience the workshop will include a discussion of production timelines, including rehearsal details and what to expect in terms of project adaptation and revisions. We will address the necessity and method of separating different parts of a performance into scenes, how to create a project that is reliable and easy to transfer to technicians, and how to create a UI that is usable in small spaces.
BeSide has worked with the team behind the innovative Looking Glass holographic display in order to build a TouchDesigner implementation. The Looking Glass displays true volumetric 3D imagery without the use of VR/AR headgear. Although this new technology is a natural fit for the TouchDesigner environment, initially only a Unity integration was made available. David Tennent and the BeSide team worked with developers at the Looking Glass Factory in Greenpoint, Brooklyn to build a port for TouchDesigner that allows any 3D geometry within TD to be rendered (with full texturing and lighting) within the volume of the Looking Glass display.
In this workshop, David Tennent will provide a brief overview of the technology behind the Looking Glass and the architecture of the TouchDesigner integration. Afterwards participants will work with the actual devices and their own computers, while running TouchDesigner, in order to learn how to set up the system. Tennent and BeSide Co-Founder/Managing Director Matthew Haberd will be available to provide tips and tricks on best practices for authoring 3D content and building TouchDesigner patches for use with the Looking Glass.
First came Kantan, then came CamSchnappr. What's next? How can you calibrate even more devices together, like cameras, projectors, and VIVE Trackers? Over the past few years Harvey Moon has been working on a number of different ways to calibrate faster and more easily using TouchDesigner. At the beginning of this talk Moon will explain how he calibrated a projector to the VIVE world coordinates by re-building CamSchnappr for single-point input. After this, he will deconstruct CamSchnappr and add the reprojection error ("ret") output. Moon will lead a discussion about the function of OpenCV's calibrateCamera and how it can be used in different contexts.
Moon will also present a project that uses fiber optics to automatically calibrate a scale city model, with a live demo of four corner calibration on a fiber sensor array. In closing Moon will briefly describe how to calibrate the intrinsics of a camera lens using OpenCV in TouchDesigner in addition to showing a system that uses cameras and structured light to calibrate three projectors. Attendees will be invited to ask questions at the end of the presentation.
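The reprojection error discussed above is the quantity OpenCV's calibrateCamera minimizes and returns: the RMS pixel distance between observed image points and 3D points projected through the estimated camera model. A simplified sketch in plain Python, ignoring rotation, translation, and lens distortion (the focal lengths and principal point used in the example are arbitrary assumed values):

```python
import math

def project(point3d, fx, fy, cx, cy):
    """Project a camera-space 3D point through a pinhole intrinsic matrix."""
    X, Y, Z = point3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

def rms_reprojection_error(points3d, observed2d, fx, fy, cx, cy):
    """RMS distance between observed image points and reprojected 3D points.

    A simplified analogue of the value cv2.calibrateCamera returns first.
    """
    total = 0.0
    for p3, (u, v) in zip(points3d, observed2d):
        pu, pv = project(p3, fx, fy, cx, cy)
        total += (pu - u) ** 2 + (pv - v) ** 2
    return math.sqrt(total / len(points3d))
```

An error near zero means the estimated camera model explains the observed points well; a large value signals bad correspondences or an inadequate model.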