Exhibitions, river project, WWCD

A little virtual tour of the online exhibition THIS IS NOT A SHOW, showcasing the work of over 50 working-class artists and creatives from the Working Class Creatives Database at Pineapple Black Arts, with a brief look at my work included in the show.
‘𝔗π”₯𝔦𝔰 𝔦𝔰 𝔫𝔬𝔱 π”ž 𝔰π”₯𝔬𝔴. 𝔗π”₯𝔦𝔰 𝔦𝔰 π”ž π”°π”±π”žπ”±π”’π”ͺ𝔒𝔫𝔱 𝔬𝔣 𝔦𝔫𝔱𝔒𝔫𝔱.’
With workingclasscreativesdatabase.co.uk/
⚑️launched on pineappleblack.co.uk/index.php/pbvarts/⚑️

The Working Class Creatives Database is a platform highlighting the work of working-class people, providing a support structure for those involved in the arts.

As of 2020, only 16% of the workforce in the creative industries identify as being from working-class backgrounds. By creating a platform for working-class creatives, the database begins to redress this imbalance within the sector, giving a voice to those who are otherwise outsiders.

Special thanks to @pineappleblackarts for giving us a virtual space and @highbrowart for the poster design.

3D scanning with Einscan

river project, UNIT 2, Videos, Weekly Summaries, Work in Progress

I briefly mentioned the new 3D scanning setup available in the 3D workshop in my weekly summary post, but I thought I should do a separate post for it, as it is a really exciting piece of kit that I have learnt to use.


The Einscan is a piece of kit and accompanying software for 3D scanning, which Jonathan (tutor) mentioned during our one-to-one tutorial was now available for use in the 3D workshop. You start by calibrating the kit and software: you put the calibration stand onto the turntable and follow the instructions in the software, rotating the removable board as shown on screen so that the dots align. You rotate the board three times, and in between each rotation the turntable spins it 360 degrees to calibrate the camera. Jonathan (technician) explained that the scanner builds the 3D model by sending out beams of light across the object; the way the light bends around the object is then captured and used to build the model.

Once it was calibrated, we used a glue gun to attach the bone to a clear plastic rod embedded in a small piece of wood, much like when I 3D scanned with the iPad and with the photogrammetry, with the rod acting as a support. Most of the bones had to be scanned twice, with the bone moved into a different position and glued again before the second scan. Jonathan then showed me how to match the two scans up in the software to produce the finished model. (He showed me this on Thursday last week, which is why I ended up with the mutant bone scan: I didn't know how to match the two scans up.)

I still have one more bone to scan, and possibly one to redo, but I have made really quick progress with this and I'm really happy with how the scans are coming out! As you can see from the video, the details and texture are being picked up much better than they were with the iPad, although not quite as well as with the photogrammetry. However, this method has a much higher success rate and is much faster, so I think the prints I get from these scans will probably be the final ones I put in the show, which is very exciting! I really want to play with these scans in Blender, maybe animating them in some way, and I would also like to play around with scale if I have time: printing them as small as possible and as large as possible. If I have time for these experiments, and if they go well, I am considering applying for the Selected Showcase at our end-of-year show. I envision displaying a short animation of the digital 3D models, or perhaps a few still images, alongside some huge and tiny 3D prints of the bones.
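As a side note on the scale experiments, working out how much to scale a scan up or down for printing is just a ratio against the mesh's bounding box. This is a minimal sketch in Python: the bounding-box dimensions and the target sizes here are made-up placeholder values, not measurements from my actual scans, and the real numbers would come from the exported STL and the printer's build volume.

```python
# Hypothetical bounding-box dimensions (mm) of one scanned bone mesh.
# Real values would be read from the exported STL in Blender or Meshmixer.
bone_bbox_mm = (42.0, 18.0, 12.5)

def fit_scale(bbox, target_longest_mm):
    """Uniform scale factor so the mesh's longest side matches target_longest_mm."""
    return target_longest_mm / max(bbox)

# "As small as possible": limited by printable detail, say a 10 mm longest side.
tiny = fit_scale(bone_bbox_mm, 10.0)

# "As large as possible": limited by the print bed, say 200 mm on the longest axis.
huge = fit_scale(bone_bbox_mm, 200.0)

print(f"tiny print scale: {tiny:.3f}x, huge print scale: {huge:.3f}x")
```

Using a single uniform factor keeps the bone's proportions intact at both extremes, which matters if the tiny and huge prints are shown side by side.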

To Do: 
– Scan the last bone
– Clean the scans up in Meshmixer
– Send the scans to print