<iframe src="https://www.youtube.com/embed/ypmaDq060oY?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nImagineering the Technosphere is a UF Intersections project funded by the Andrew W. Mellon Foundation. \n \nIn the face of our growing technological dependencies, our intersections group explores how humans use technology, intentionally and unintentionally, to alter our physical world. The group will study the accelerating pace of social technologies, such as the Internet and Artificial Intelligence. \n \n<iframe src="https://www.youtube.com/embed/ckNEjNj5Lps?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nThe group aims to discover what the lessons of past inventions can teach us about how to address the problems facing humanity today, particularly as they emerge in the “technosphere,” the landscape shaped by human hands. The group will develop an interactive website, host a regular research workshop, and organize events with speakers and filmmakers about technoscience. To engage students with questions of space, place, and time, a faculty member will work with students to build a mobile app for time travel in augmented reality to examine the hidden role of technology on the UF campus. This work will lay the foundation for team-taught and other new courses that will give students the tools to envision how they will “imagineer” the future of the planet while harnessing the power of technologies in environmentally and socially sustainable ways. \n \nIn Spring 2020 our team will offer a course titled “Imagineering the Technosphere.”
The purpose of this course is to respond to the grand challenge question “How do technologies influence our lives, then and now?” from the perspectives of six thematic units: 1) Inventions and Sciences, 2) Spaces and Infrastructure, 3) Past and Future, 4) Imagining and Designing, 5) Conservation and Sustainability, and 6) Culture and Society. This interdisciplinary approach will equip students with foundational knowledge and tangible skills through weekly modules and experiential learning activities organized as part of the <a href="https://research.dwi.ufl.edu/projects/technosphere/index.php/uf-quest-game/">“UF Quest Game”</a>, a gamified learning experience designed specifically for this course. Students will transcend the boundaries of traditional disciplines and see how the humanities serve as the foundation for understanding science and technology, and how this holistic approach can shape our decision-making processes, both as individuals and on a planetary scale. \n \n \n<strong>In the humanities class “Imagineering the Technosphere,” homework isn’t based on a book chapter, but an adventure through campus guided by the GPS-powered Time Traveler app.</strong> \n \nSisters Christine and Reyna Mae Cuales, both taking the class this semester, followed the prompts on the app, which steered them closer to their destination. So far, they’ve visited the Harn Museum of Art, the McKnight Brain Institute, the Baughman Center on Lake Alice and the <a href="https://digitalworlds.ufl.edu/institute-information/facilities/">Digital Worlds Institute in Norman Hall</a>, among others. Today, they’re closing in on a location near the historic central campus. When students successfully navigate to the mystery location using the app, a screen pops up that tells them they’ve arrived, offers some background about the place, and poses a reflection question about the place and its use over time.
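The article does not describe how the Time Traveler app decides that a student has arrived. A common approach for such GPS checks is a geofence: compare the great-circle (haversine) distance to the target against a small arrival radius. The sketch below is a hedged illustration of that idea only; the class name, coordinates, and radius are assumptions, not values from the actual app.

```java
public class ArrivalCheck {
    static final double EARTH_RADIUS_M = 6371000.0; // mean Earth radius

    // Great-circle distance between two latitude/longitude points,
    // computed with the haversine formula (all angles in degrees).
    public static double distanceMeters(double lat1, double lon1,
                                        double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // "You've arrived" when the student is within the geofence radius.
    public static boolean hasArrived(double lat, double lon,
                                     double targetLat, double targetLon,
                                     double radiusMeters) {
        return distanceMeters(lat, lon, targetLat, targetLon) <= radiusMeters;
    }
}
```

With a radius of a few dozen meters, the popup would trigger only once the student is standing at the mystery location rather than merely near it.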
\n \nProfessor Angelos Barmpoutis says the intention of the app — and its corresponding board game — is to get students to see their surroundings in a new way. \n \n“These places are deeply connected to the past and tied to the future,” he said. “I’m trying to get them to think about the things they pass every day.” \n \nWhen students followed the app to the Norman Gym, for example, they saw a facility originally used for basketball, its wood floors still visible, hosting a weekend-long <a href="https://globalgamejam.org/2019/jam-sites/university-florida-digital-worlds-institute">video game design competition</a>. The experience gave them an opportunity to reflect on how not only the space but also the nature of sports and competition evolved, Barmpoutis said. \n \nBarmpoutis is one of <a href="https://research.dwi.ufl.edu/projects/technosphere/Technosphere_Spring2020_SyllabusV1.2_AB.pdf">seven professors</a> who team-teach the class, tackling fields that range from anthropology to historic preservation. Each professor’s lesson includes the hunt for several of the game’s 22 3D-printed pieces, which students collect after finding the location and submitting video responses to the questions posed in the app. \n \n<iframe src="https://www.youtube.com/embed/B230Yo5SVmY?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nWalking north on Buckman Drive, the Cuales sisters can see that they’re getting closer to today’s location. Then they cross the street, and the app tells them they’ve arrived. It’s Dauer Hall, a Collegiate Gothic brick building from 1936 with arches, bay windows and stained glass that once served as the student union. They record their video response about connection and continuity in historic places, then check the app for their next destination.
\n \nReyna Mae, a pre-health student, and Christine, who’s studying sustainability and the built environment, both say they have discovered academic interests they wouldn’t have known about without the course. \n \n“When you’re just focused on your major, you don’t get to explore other classes,” Christine said. “Getting to know all of the professors and fields in this class opens up your eyes.”
Java For Kinect (J4K)
<iframe src="https://www.youtube.com/embed/q0K4Y4g-hj0?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nThe J4K library is a popular open-source Java library that implements a Java binding for Microsoft's Kinect SDK. It communicates with a native Windows library, which handles the depth, color, infrared, and skeleton streams of the Kinect, using the Java Native Interface (JNI). \n \nThe J4K library is compatible with all Kinect devices (Kinect for Windows, Kinect for XBOX, new Kinect, or Kinect 2) and allows you to control multiple sensors of any type from a single application, as long as your system capabilities permit. For example, you can control three Kinect 1 sensors, or one Kinect 1 and one Kinect 2 connected via USB 3.0 to the same computer. Furthermore, the J4K library contains several convenient Java classes that convert the packed depth frames, skeleton frames, and color frames received by a Kinect sensor into easy-to-use Java objects.
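The “packed depth frames” mentioned above refer to the Kinect's native sample format: in the Kinect 1 SDK, each 16-bit depth sample carries a player index in its low 3 bits and the depth in millimeters in the remaining high bits. J4K performs this unpacking internally; the standalone sketch below (not part of the J4K API, just an illustration of the packing) shows what that conversion involves:

```java
public class DepthUtils {
    // Kinect 1 packs the player-segmentation index into the low 3 bits of
    // each 16-bit depth sample and the depth in millimeters into the upper
    // bits, so unpacking is a mask plus an unsigned shift.
    public static int depthMillimeters(int packedSample) {
        return (packedSample & 0xFFFF) >>> 3;
    }

    public static int playerIndex(int packedSample) {
        return packedSample & 0x7; // 0 = no tracked player
    }
}
```

Libraries like J4K hide this bit-level detail behind Java objects, which is exactly the convenience the paragraph above describes.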
Digital Epigraphy and Archaeology Project
<iframe src="https://www.youtube.com/embed/8_KE--_pbzE?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nDEA is an interdisciplinary project initiated by scientists from the Digital Worlds Institute and the Department of Classics at the University of Florida. The goal of the project is to develop new open-access scientific tools for the Humanities and to apply concepts from digital and interactive media and computer science to Archaeology and Classics. On our website you can view our 3D collections and interact with our online exhibits, read about our recent results, find interactive demos of our projects, and learn more about our future research directions. \n \nBringing together Digital Media, Computer Science, and the Humanities. \n \n<iframe src="https://www.youtube.com/embed/yh6MyLLFSTo?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n
<header> \n<h3><span style="font-size: 16px;">The UF Clinical and Translational Science Institute Funds Interactive Rehabilitation Environment</span></h3> \n</header> \n<div class="main"> \n \nThe UF Clinical and Translational Science Institute has awarded a $7,500 grant to support a pilot project for promoting walking recovery and enhancing sensory input in children with spinal cord injuries. Digital Worlds Professor Angelos Barmpoutis is participating as a co-investigator in this project along with other professors from the UF departments of Neuroscience and Physical Therapy. The goal of this collaborative team, led by Dr. Emily J. Fox, is the development of a game-type environment that motivates children with disabilities to walk. \n \nThe incorporation of interactive games and virtual reality (VR) is an innovative approach for making rehabilitation more engaging. Game technology motivates children and promotes practice and performance of specific motor skills. Although games have demonstrated therapeutic effects when applied to children with neurological injuries, most games are not designed with consideration for motor impairments or for use in the locomotor training (LT) environment. Therefore, the long-term objective of this project is to develop interactive gaming technology for the advancement of locomotor training interventions for children with neurological injuries. This project is the first phase in meeting this objective and will result in the development of a technical game prototype. A collaborative multidisciplinary team has been formed with expertise in basic neuroscience, rehabilitation, computer science, and game development. Focus groups of 8 children with spinal cord injury (SCI) and cerebral palsy (CP), along with their caregivers and clinicians (physical therapists and physiatrists), will be formed. Feedback from these groups will be incorporated into the team’s development of a game-design document. Using an iterative game development approach, a game software prototype will be developed and optimized for use with LT.
Recently released PrimeSense™ technology that allows for interactive controller-free play will be used and interfaced with the game prototype. Development of this game prototype and pilot data from its use will lead to a competitive NIH grant application. Moreover, the combined application of basic science, rehabilitation, and game technology has a high likelihood of enhancing walking rehabilitation approaches for children with neurological injuries. \n \n</div>
3D Scanning the Rosetta Stone
Use 1-finger and 2-finger gestures to move, rotate, and zoom the 3D model of the Rosetta Stone below: \n<iframe src="https://research.dwi.ufl.edu/op.n/file/2fzrs3i2cvas964f/embed" width="1200" height="600" frameborder="0" scrolling="no"></iframe> \n \nWith the permission of the British Museum, an interdisciplinary team from the University of Florida and the University of Leipzig scanned the Rosetta Stone in June 2018 to generate a high-resolution 2D and 3D map of its inscribed surface. In our setup, we used a single DSLR camera (Nikon D3400), which was fixed on a tripod in front of the stone and calibrated as follows: exposure time = 5 sec., ISO speed = ISO-100, F-stop = f/25, focal length = 135 mm, and max aperture = 4.5. To reconstruct the three-dimensional inscribed surface using the shape-from-shading method, we controlled the lighting of the stone using a handheld light wand (Ice Light) that served as a 15-inch-long light source of 1600 lumen at 5600 K color temperature. \n \nWe divided the artifact into 8 regions (4 rows and 2 columns), which were photographed individually at 6000 x 4000 pixel resolution. Each region was photographed under 4 different lighting directions (light from the left, top, right, and bottom) by placing the light wand on the corresponding side of the region of interest. This quadri-directional lighting configuration allowed us to capture information related to the local orientation at each point of the surface through the differences in light reflection observed in the corresponding four photographs. The entire scanning session, including opening the glass case of the artifact, setting up the equipment, digitizing the artifact, and restoring everything to its original configuration before the opening of the museum, took 120 minutes.
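As a simplified illustration of why opposing light directions encode surface orientation: a facet tilted toward a light appears brighter than one tilted away, so per-pixel intensity differences between the left/right and top/bottom photographs approximate the surface gradient, from which a unit normal can be estimated. The sketch below is only this naive difference scheme, not the published reconstruction method, which is considerably more sophisticated:

```java
public class ShadingSketch {
    // Naive four-light normal estimate: intensity differences between
    // opposing lights approximate the surface gradient (gx, gy); the unit
    // normal is then (-gx, -gy, 1) normalized. Illustrative only.
    public static double[] normalFromFourLights(double left, double top,
                                                double right, double bottom) {
        double gx = right - left;  // brighter on the right => tilts rightward
        double gy = bottom - top;  // brighter at the bottom => tilts downward
        double len = Math.sqrt(gx * gx + gy * gy + 1.0);
        return new double[] { -gx / len, -gy / len, 1.0 / len };
    }
}
```

A flat, evenly lit patch yields the straight-up normal (0, 0, 1); any imbalance between the four exposures tilts the estimate, which is the information the depth-map computation exploits.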
\n \nDuring this time, 32 photographs were taken in total (8 regions x 4 lighting conditions), which were then processed to compose high-resolution 2D and 3D representations of the surface with a sample spacing of 0.08141 mm, equivalent to a resolution of 312 DPI. The three-dimensional details of the inscribed surface were captured in the depth map, which was computed by processing the four corresponding images of the same region of interest, illuminated from four different lighting orientations, using the method by A. Barmpoutis, E. Bozia, and R. Wagman published in Machine Vision and Applications 21(6) in 2010. The depth map contains detailed three-dimensional information of the inscribed surface so that it can be visualized in 3D. The 3D reconstructed surface can be rendered as an interactive 3D model that can be manipulated by the user (move, scale, rotate) and inspected under different virtual lighting orientations and shading methods. \n \nFinally, in addition to the 3D reconstruction of the inscribed surface, we used a handheld laser scanner (Structure Sensor by Occipital) mounted on a tablet computer (iPad Air by Apple) in order to create a 3D model of the entire stone. Although the 3D model generated by this scanner depicts the overall shape of the entire artifact, it does not have enough resolution to capture the fine details of the inscribed surface. Therefore, the 3D surface reconstructed using shape-from-shading is complementary to the laser-scanned 3D model, as the two forms co-exist to depict different structural details of the artifact. \n \nThe result of this process is a high-resolution 3D representation of the Rosetta Stone that is available online as an interactive web app and can be accessed through the project's website.
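The stated equivalence between the 0.08141 mm sample spacing and 312 DPI is a direct unit conversion (25.4 mm per inch), which can be checked in one line:

```java
public class ResolutionCheck {
    // Dots per inch = (25.4 mm per inch) / (sample spacing in mm).
    // For 0.08141 mm this gives approximately 312 DPI, as stated.
    public static double dotsPerInch(double sampleSpacingMm) {
        return 25.4 / sampleSpacingMm;
    }
}
```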
\n \nIn November 2019, the project was featured in German news outlets: <a href="https://www.mdr.de/wissen/stein-von-rosette-digital-leipzig-100.html">https://www.mdr.de/wissen/stein-von-rosette-digital-leipzig-100.html</a>
<iframe src="https://www.youtube.com/embed/CPgzP6SXouk?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nDuring the COVID-19 pandemic, education has been severely impacted across the globe. Ongoing class sessions are especially important for K-12 students, many of whom benefit from experiential learning that is not typically offered in ad hoc online settings. UF Digital Worlds Institute professor Angelos Barmpoutis is working in partnership with the UF Literacy Institute (UFLI) to address this worldwide need with an innovative virtual platform solution. \n \nThe Virtual Word Work Mat (VWWM) is an interactive app designed for literacy instruction, based on the original physical classroom version of the Word Work Mat created by UF doctoral student Valentina Contesse. Valentina stated, “I never imagined when I was making Word Work Mats for my first graders, using file folders and Velcro, that it could be transformed into a digital tool that teachers all over the world would use in their classrooms!” \n \nAccording to Facebook analytics, just two days after its release date the VWWM had already reached more than 70,000 people, with more than 6,000 engagements and 300 shares, and UFLI’s online engagement increased by more than 1,000 subscriptions during the same period. Creating and offering ready access to the Virtual Word Work Mat during the pandemic lockdown has empowered teachers and students to continue their literacy instruction as part of their online learning activities. Designed to work on tablets and other devices running either iOS or Android, VWWM provides a simple user interface in which students can manipulate letter and phoneme cards with intuitive touch gestures and compose words at home.
\n \nHolly Lane, Director of UFLI, said, “We’re so excited about the partnership between the UF Literacy Institute and Digital Worlds in our response to the pandemic. Thanks to the commitment and technical expertise of Angelos Barmpoutis, we were able to take some of our interactive literacy instruction materials and bring them to life on a virtual platform. The response from teachers has been overwhelmingly positive. We have many thousands of teachers accessing the materials and sharing the links on social media.” \n \nThis excitement is shared by Digital Worlds Director James Oliverio. “One of the great benefits of experiential online learning is accessibility across the traditional challenges of demographics, geography, and time zones. This project is an example of the interdisciplinary strengths of the University of Florida; faculty stepping up in a time of need to provide tangible benefits from the ongoing research and development happening across our campus.” \n \nUFLI Director Lane also stated, “Together, UFLI and DW are making a difference for teachers and their students. We hope this is just the beginning of our collaboration!” \n \n<strong>Sample Comments from Parents and Teachers:</strong> \n \n“I used this for the first time yesterday in my virtual reading lesson with my struggling readers. Their response was priceless. They were so engaged and actively asking me to change the letters and make new words.” \n \n“This has been amazing for my daughter with a severe Auditory Processing Disorder and dysgraphia. Thank you!!!” \n \n“This is amazing — thank you! Can’t wait to share with my teachers!” \n \n“This is immeasurably helpful!!!!”
\n \n<strong>Press Release by the UF College of the Arts:</strong> \n \n<a href="https://arts.ufl.edu/in-the-loop/news/digital-worlds-institute-researcher-creates-experiential-online-learning-app/">https://arts.ufl.edu/in-the-loop/news/digital-worlds-institute-researcher-creates-experiential-online-learning-app/</a> \n \n<strong>Links to the App:</strong> \n \nTry the Virtual Word Work Mats below: \n \n<strong>Beginner Word Work Mat</strong> \nDirect link: <a href="https://research.dwi.ufl.edu/op.n/file/cbhd8xmn9i4ctf7i/embed">Beginner Word Work Mat</a> \nDrag and drop the letter cards below: \n<iframe src="https://research.dwi.ufl.edu/op.n/file/cbhd8xmn9i4ctf7i/embed" width="1200" height="600" frameborder="0" scrolling="no"></iframe> \n \n<strong>Intermediate Word Work Mat</strong> \nDirect link: <a href="https://research.dwi.ufl.edu/op.n/file/gc8nkxns914enc7d/embed">Intermediate Word Work Mat</a>. \nDrag and drop the letter cards below: \n<iframe src="https://research.dwi.ufl.edu/op.n/file/gc8nkxns914enc7d/embed" width="1200" height="600" frameborder="0" scrolling="no"></iframe>
<iframe src="https://www.youtube.com/embed/DOQcUWMz8xU?feature=oembed" width="1200" height="600" frameborder="0" allowfullscreen="allowfullscreen"></iframe> \n \nThe purpose of this project is to introduce computer programming in humanities curricula by establishing a correspondence between natural languages and computer languages. The exercises discussed here show how you can transcribe into code: 1) a theatrical play in English (Shakespeare's Romeo and Juliet), 2) the common notions in Ancient Greek from Euclid's Elements, Book 1, and 3) the discrepancy between the Julian and Gregorian calendars, computed using the Latin documents of Pope Gregory XIII. \n \nThe framework uses a tool that instantly renders computer code as natural language in English, Spanish (Español), German (Deutsch), Italian (Italiano), Greek (Ελληνικά), Turkish (Türkçe), etc., as well as ancient languages such as Ancient Greek (Ἑλληνιστὶ) and Latin (Lingua Latina). \n \nThe scientific content of this presentation can be found in this article: \nBarmpoutis, A., 2018. Learning Programming Languages as Shortcuts to Natural Language Token Replacements. Proceedings of the 18th Koli Calling International Conference on Computing Education Research, pp. 1-10. <a href="https://research.dwi.ufl.edu/people/angelos/page/learning-programming-languages-as-shortcuts-to-natural-language-token-replacements/">Download PDF</a>
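The calendar exercise mentioned above reduces to counting the century years that are leap years in the Julian calendar but not in the Gregorian one, since those are the only days on which the two calendars diverge. A minimal sketch of that computation (the standard formula, valid for dates after February of the given year; not necessarily the code used in the course materials):

```java
public class CalendarDrift {
    // Days by which the Julian calendar lags the Gregorian in a given year.
    // Every century year is a Julian leap year, but only those divisible by
    // 400 are Gregorian leap years; the constant -2 anchors the count so
    // that the gap equals the 10 days dropped at the 1582 reform.
    public static int julianGregorianGapDays(int year) {
        return year / 100 - year / 400 - 2;
    }
}
```

For example, the gap was 10 days in 1582 (the dates skipped by Pope Gregory XIII's reform), 11 after 1700, 13 throughout the twentieth century, and will grow to 14 after 2100.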