I expect whoever coated the remains with that red cinnabar stuff died rather early, probably with tooth and hair loss and severe mental issues. Perhaps this fate was expected but given that "mad hatters" were a thing until fairly recently, people can be a bit strange when it comes to dealing with poisons.
The guide notes point out that only the most sacred rituals involved this red mercurial stuff. I'm not surprised. It might be rare but rarer still will be people willing to deploy it unless that fate is considered a good way to go.
That tour is a remarkable use of the technology.
I was wondering about this too: they've found high levels of mercury in the water supply at Maya cities and believe now it contributed to the eventual collapse: https://arstechnica.com/science/2020/06/mercury-and-algal-bl...
This makes me think: what if today's rulers are being poisoned by something making them act like idiots?
I’m a 3D artist currently encountering staunch resistance to generating 3D models from drone-captured photogrammetry of historically protected sites in Pennsylvania, USA.
I’ve had resistance at the state and county level when pursuing takeoff and landing permission at historical sites. Communicating my intent, digital historic preservation via photogrammetry, has been a difficult “sell”.
I’m a licensed commercial remote pilot; however, I need property owner permission to take off and land, and many sites in my area are on state- or county-owned property.
Another idea: if you don't already have any formal education in history, you could study for some qualifications in the subject. It would probably do much to reassure landowners that you are not going to harm the sites in any way (although I struggle to think of a way you could do so with a UAV!) In any case, good luck; I'd love to see the models!
Why?
In general we clearly have the technology to capture 4K-8K environments and turn them into very realistic virtual worlds. Is anybody even doing such work? For example capturing a neighborhood in San Francisco (or any city) as it looks in 2024 for historical reference? Seems like that should be a thing.
I've seen high quality environmental scans, even way back in the Silicon Graphics days when they showed an amazing scan of the Sistine Chapel. But it seems to me all such scans wind up in some proprietary player format which was designed by somebody who never played a decent open world game like Fallout 4, Cyberpunk, Battlefield, or Red Dead Redemption. I have yet to see a museum environmental scan which gets anywhere near the immersive quality of those games. This is not so much a criticism of such work - it's awesome! - but maybe more of a call to arms for game people to help out the scholars.
Or here's the trailer for the project in Unreal: https://www.youtube.com/watch?v=hlNgpG9X7mc
I have a lot of work to do keeping up with games, it's true--games are expensive to build, and a photorealistic art style dates quickly while stylized graphics age more gracefully. I'm trying to fundraise to build this game currently, but it's a tough sell. Educational games don't do well on Steam, so right now, I'm just distributing through my website as I build. The small income this provides helps me contribute back to the modern Egyptian Egyptologists that are excavating and documenting their own culture though.
The latest Game Science title, Black Myth Wukong, does an awesome job with 3d captures of Chinese monuments and bringing the mythology and history to life.
Unfortunately it's a lot of code to write to support rolling-shutter cameras strapped to multicopters, where you capture video with a short enough exposure to prevent blur. The 3D recovery has to respect the fact that the rows of the image are taken from different positions and angles, and that fact ends up infiltrating basically the entire pipeline.
And global shutter cameras are barely accessible.
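To illustrate the row-by-row problem described above, here is a minimal sketch (not from the commenter's actual code) of the core idea a rolling-shutter-aware pipeline has to bake in everywhere: each image row gets its own camera pose, interpolated across the frame's readout time. It assumes constant velocity between the poses at the start and end of readout, which is the simplest common motion model:

```python
import numpy as np

def row_poses(pos_start, pos_end, n_rows):
    """Interpolate a camera position for every image row.

    With a rolling shutter, row r is exposed slightly later than row 0,
    so under a constant-velocity assumption its camera center lies a
    fraction r/(n_rows-1) of the way from the pose at the start of
    readout to the pose at the end.
    """
    pos_start = np.asarray(pos_start, dtype=float)
    pos_end = np.asarray(pos_end, dtype=float)
    t = np.linspace(0.0, 1.0, n_rows)[:, None]  # per-row readout fraction
    return (1.0 - t) * pos_start + t * pos_end

# A drone moving 0.3 m along x during one frame's readout:
poses = row_poses([0.0, 0.0, 10.0], [0.3, 0.0, 10.0], n_rows=5)
```

In a real pipeline this per-row pose (rotation included, not just position) has to be threaded through projection, triangulation, and bundle adjustment, which is why it infects everything.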
If there's some group with the manpower and funding to actually pull this off, please get in touch, I would like to pick it back up!
Can you share the technical background you've used for creating the 3D reconstruction? Like software packages, or algorithms used.
Are we looking at the result of packages like OpenSfM here, or COLMAP?
So in the virtual tour, you're seeing 360 imagery from the cameras and a lower resolution version of the 3d capture data, optimized for web. The lower res mesh from the scanner is transparent in first-person view mode so users get cursor effects on top of the 360 image.
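The commenter doesn't say how the lower-resolution web version is produced, but one common way to derive a reduced dataset from a dense scan is voxel-grid downsampling: snap points to a coarse grid and keep one averaged representative per cell. A generic sketch, not their actual pipeline:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by averaging all points that fall in the
    same cubic voxel -- a common first step when preparing a dense
    scan for web delivery."""
    points = np.asarray(points, dtype=float)
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel cell and average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, 3))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]

# Two nearby points collapse into one voxel; the distant point survives.
dense = np.array([[0.01, 0.0, 0.0], [0.02, 0.0, 0.0], [1.5, 0.0, 0.0]])
sparse = voxel_downsample(dense, voxel_size=1.0)
```

The same grid-then-decimate idea applies to meshes, where it is usually done with edge-collapse simplification instead.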
For film, PBS sent out a documentary crew, and they wanted me to render some footage of the full tunnel system, so I exported the e57 pointcloud data from Matterport and rendered the clips they needed in Unreal. It should be coming out soon with "In the Americas."
This is amazing. Thank you for sharing.
For Unreal I used a few methods but mostly conventional photogrammetry incorporating the lidar.
Ultimately, I'm hoping to offer a file download or some type of pixel streaming for the web until NeRFs or splats -- or whatever follows them -- work out.
Edit: also very nice tool :)!
I'm confused, mate: why and how would 21st-century professional archaeologists avoid using modern powered tools and techniques? That's absurd, dangerous, and not cost-effective.
Did you use the Pro3 as the capture device? Before the collapse anyway!
I mostly use the Pro3 now but did a big chunk of this Georgia Tech scan with the BLK: https://my.matterport.com/show/?m=PB8FgAyyjHx
That's an impressive huge capture!
Is it hard to avoid integrator error in long tunnels?
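Taking "integrator error" to mean accumulated registration drift, a toy simulation shows why long tunnels are hard: each scan-to-scan alignment adds a small independent error, so without loop closures the end-of-tunnel error grows roughly like sqrt(n) times the per-scan error. This is an illustrative model under assumed Gaussian per-hop noise, not a claim about Matterport's alignment:

```python
import math
import random

def chained_drift(n_scans, sigma_per_scan, trials=2000, seed=0):
    """Simulate chained scan-to-scan registration: each hop adds a
    small independent position error, so the RMS error at the far end
    grows like sqrt(n_scans) * sigma_per_scan when there are no loop
    closures to correct it."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(trials):
        err = sum(rng.gauss(0.0, sigma_per_scan) for _ in range(n_scans))
        total_sq += err * err
    return math.sqrt(total_sq / trials)  # RMS end-point error

# 1 cm of noise per hop: a 100-scan tunnel drifts far more than a 10-scan one.
short_run = chained_drift(n_scans=10, sigma_per_scan=0.01)
long_run = chained_drift(n_scans=100, sigma_per_scan=0.01)
```

A dead-end tunnel offers no loop to close, which is exactly the worst case for this kind of chained alignment.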
For me, the Maya have always been important because they're our history and our stories in the Americas (I'm from the US) -- more than the greco-roman mythology I grew up with. They grew corn and loved ball games. Their stories are more directly our stories, and their struggles are our struggles.
Not to be political, but they also kind of wiped themselves out through large-scale environmental collapse, and the jungle is filled with their undiscovered monuments. There's still so much to learn.
People really geek out on how much the Maya knew about astronomy too -- they shot archaeoastronomy docs twice while I was working on site. Richard Feynman even helped decipher Maya glyphs and writes about it in "Surely You're Joking, Mr. Feynman!". He also gave a lecture, if the audio file can be checked out somehow: https://collections.archives.caltech.edu/repositories/2/acce...
Did you take any scans after sections collapsed? Would love to hear more about what happened.
I did take some scans after the collapse! After we'd dug ourselves out and crawled out on our bellies, I went back with Polycam. The collapsed section we dug through was comparatively small, maybe 4-5 meters: Section 1: https://poly.cam/capture/4BB863F2-1CC3-46E3-8BDB-232EE3057BD...? (you can see where we crawled out to the intersection here -- the whole intersection had ceiling collapse, but only the section we dug out through was fully covered). Section 2: https://poly.cam/capture/3C5BB7BD-5FC9-4C00-AE1C-84E0544C51C...
We're just lucky it wasn't a rocky ceiling that fell, that would've been much worse.
The team taking care of the tunnels is doing an amazing job with the resources they have, and they're continually backfilling tunnels now and maintaining the ones that are there. It took us about an hour to dig out.
You can compare to the intersection in the matterport version in the same vicinity: https://my.matterport.com/show/?m=r5BR6K6Qxix&ss=338&sr=-.21...
I don't want to editorialize too much, but at that moment we were totally brothers--I was still early with Spanish, and the language, country, age differences fell away, and we dug ourselves out.