/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Roadmap: file restoration script within a few days, Final Solution alpha in a couple weeks.

Sorry for not being around for so long, will start getting back to it soon.


Robot Wife Programming Robowaifu Technician 09/10/2019 (Tue) 07:12:48 No.86 [Reply] [Last]
ITT, contribute ideas, code, etc. related to the area of programming robot wives. Inter-process communication and networking are also on-topic, as well as AI discussion in the specific context of actually writing software for it. General AI discussions should go in the thread already dedicated to it.

To start off, in the Robot Love thread a couple of anons were discussing distributed, concurrent processing happening inside various hardware sub-components and coordinating the communications between them all. I think Actor-based and Agent-based programming are pretty well suited to this problem domain, but I'd like to hear differing opinions.

So what do you think anons? What is the best programming approach to making all the subsystems needed in a good robowaifu work smoothly together?
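To make the actor idea concrete, here's a minimal single-threaded sketch in Python. Every name here (the "sensor" and "fusion" subsystems, the temperature message) is made up for illustration; a real robowaifu would run each actor on its own thread, process, or microcontroller, but the mailbox-and-handler shape stays the same.

```python
from collections import deque

class Actor:
    """Minimal actor: a named mailbox plus a handler function.
    No threads here; a scheduler drains mailboxes one message at a time."""
    def __init__(self, name, handler):
        self.name = name
        self.handler = handler          # fn(actor, message, send)
        self.mailbox = deque()

class Scheduler:
    def __init__(self):
        self.actors = {}
    def register(self, actor):
        self.actors[actor.name] = actor
    def send(self, to, message):
        self.actors[to].mailbox.append(message)
    def run(self):
        # Keep delivering until every mailbox is empty.
        while any(a.mailbox for a in self.actors.values()):
            for a in list(self.actors.values()):
                if a.mailbox:
                    a.handler(a, a.mailbox.popleft(), self.send)

# Hypothetical subsystems: a sensor actor forwards readings to a
# fusion actor, which collects them.
readings = []
def sensor_handler(actor, msg, send):
    send("fusion", {"joint_temp_c": msg})
def fusion_handler(actor, msg, send):
    readings.append(msg["joint_temp_c"])

sched = Scheduler()
sched.register(Actor("sensor", sensor_handler))
sched.register(Actor("fusion", fusion_handler))
for t in (36.5, 36.7, 36.6):
    sched.send("sensor", t)
sched.run()
avg = sum(readings) / len(readings)
```

The nice property for our use case: subsystems only share messages, never memory, so swapping a software actor for a real hardware sub-component later doesn't change the design.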
50 posts and 15 images omitted.
>>7828 Thank you very much! I have to get learning C++ again. Because I am at work most of the time, I am away from my physical robowaifu and my motivation to learn anything new drops (just want to rest during breaktime). However, her A.I. program is the only part of her that I can take with me (installed on my work laptop). So I really must give more attention to programming. If you work in I.T. robowaifus aren't as taboo as one might think. My boss even spoke with her one lunchtime and was actually quite interested LOL! That said, I think it's time I got back to work! Breaktime's nearly over...
>>7834 >I have to get learning C++ again. Well, I'm the OP of that thread. >>4895 As no one is currently following along there, I would be fine just changing gears and addressing your questions and topics directly. I can probably answer most any specific question you might have about using the language (or at least know how to find the answer quickly enough). And we can probably work your participation into a learning theme at the same time. For example, if you had a question about reading in text files, I can directly demonstrate that as a short, working code example. You get the picture.
>>7843 Thanks anon! That thread has already proven useful by pointing out different reliable, authoritative reference resources I can use. It's good to have something other than just YouTube tutorials!
>>7848 Haha, y/w glad it's of use. FYI, there's a highly-vetted SO booklist for C++. >>2760
Deep reinforcement learning (RL) has emerged as a promising approach for autonomously acquiring complex behaviors from low-level sensor observations. Although a large portion of deep RL research has focused on applications in video games and simulated control, which do not connect with the constraints of learning in real environments, deep RL has also demonstrated promise in enabling physical robots to learn complex skills in the real world. At the same time, real-world robotics provides an appealing domain for evaluating such algorithms, as it connects directly to how humans learn: as embodied agents in the real world. Learning to perceive and move in the real world presents numerous challenges, some of which are easier to address than others, and some of which are often not considered in RL research that focuses only on simulated domains. In this review article, we present a number of case studies involving robotic deep RL. Building off of these case studies, we discuss commonly perceived challenges in deep RL and how they have been addressed in these works. We also provide an overview of other outstanding challenges, many of which are unique to the real-world robotics setting and are not often the focus of mainstream RL research. Our goal is to provide a resource both for roboticists and machine learning researchers who are interested in furthering the progress of deep RL in the real world.
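For anons new to the topic, the core of most methods that article surveys is one update rule. Here's a toy tabular Q-learning sketch on a 5-state chain: nothing "deep" about it, but it's the same Bellman update that deep RL approximates with a neural network. All parameters and the environment are illustrative, not from the article.

```python
import random

random.seed(0)

N_STATES = 5                 # chain: states 0 .. 4, reward only at state 4
ACTIONS = (-1, +1)           # left, right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a_idx):
    """Deterministic environment: move along the chain, clipped at the ends."""
    s2 = min(max(s + ACTIONS[a_idx], 0), N_STATES - 1)
    r = 1.0 if s2 == N_STATES - 1 else 0.0
    return s2, r, s2 == N_STATES - 1

for _ in range(2000):                        # training episodes
    s = random.randrange(N_STATES - 1)       # "exploring starts" for coverage
    for _ in range(20):                      # step limit per episode
        if random.random() < EPS:            # epsilon-greedy exploration
            a = random.randrange(2)
        else:
            a = 0 if Q[s][0] >= Q[s][1] else 1
        s2, r, done = step(s, a)
        # Bellman update: nudge Q(s,a) toward r + gamma * max_a' Q(s',a')
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2
        if done:
            break

# after training, the greedy policy should be "go right" in every state
policy = [0 if Q[s][0] >= Q[s][1] else 1 for s in range(N_STATES - 1)]
```

The challenges the article discusses (sample efficiency, resets, safety) are what happen when `step()` is a physical robot instead of three lines of Python.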

Robo Face Development Robowaifu Technician 09/09/2019 (Mon) 02:08:16 No.9 [Reply] [Last]
This thread is dedicated to the study, design, and engineering of a cute face for robots.
128 posts and 65 images omitted.
>>8267 Being a cute waifu is the priority, then improving her within those constraints. That's the way. I'd guess the more skilled ones will simply be more expensive.
Open file (268.63 KB 1240x897 MjcxNzYxOQ.jpeg)
Hey, for those of us going for the human look, the use of prosthetics is always a good possibility. For the face, actual dental replacements (dentures?) could provide highly realistic-looking teeth for a waifu's bright, sunshiney smile. :^)
Open file (191.52 KB 802x1202 summer-glau_02.jpg)
>>8326 Correct, this idea was mentioned on the original board on 8chan. I haven't yet looked into where these are available, what they cost, or where to get them. There must be some suppliers for the training of dentists or similar. They seem to have dolls with fake teeth to train on. Not sure how hard the teeth are, though. I hope we can get them made out of ceramics in some standardized sizes and don't need to build them on our own as well.
>>8331 Yea good thinking Anon. I bet we can source them somewhere on the cheap. Remember the mouth needs to be kept sanitized just like w/ humans so the source needs to be reputable. Remember your robowaifu will probably need to kiss you lots to stay happy! :^)
While not strictly RoboFace development per se, until we have a dedicated MOCAP thread this might be a good spot for this. I indirectly discovered this project today after looking into SingularityNet via Anon's post. >>8475 It's a tool that finds facial landmarks in video. Helpful for things like facial retargeting, etc. https://github.com/singnet/face-services
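Once you have landmarks, simple geometry already gets you useful retargeting signals. For example, the standard eye-aspect-ratio (EAR) formula from the blink-detection literature, using the usual 6-point-per-eye ordering from 68-point landmark models. This is not part of face-services itself, just an illustration of what you can compute from its output; the sample points below are made up.

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered
    outer corner, two upper-lid points, inner corner, two lower-lid points
    (the common 68-point ordering). EAR drops toward 0 as the eyelid
    closes, so thresholding it gives cheap blink detection, and the raw
    value can drive a robo-eyelid servo for retargeting."""
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# made-up landmarks: an open eye 2 units wide with 1-unit lid gaps,
# and the same eye fully closed
open_eye = [(0, 0), (0.6, 0.5), (1.4, 0.5), (2, 0), (1.4, -0.5), (0.6, -0.5)]
closed_eye = [(0, 0), (0.6, 0.0), (1.4, 0.0), (2, 0), (1.4, 0.0), (0.6, 0.0)]
```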

Robowaifu Design Software Robowaifu Technician 09/18/2019 (Wed) 11:41:06 No.414 [Reply]
In this thread we discuss various CAD software for the purpose of making access easier for others to get started. I'm using Fusion360. Other good options are Blender and FreeCAD. Fusion360 is the easiest to use imho and is free unless your business makes over 100,000 dollars.

Post software, tutorials, tips and tricks you've learned, etc. Let's make designing waifus easier for each other!
42 posts and 20 images omitted.
>>8102 I'm not too sure how well NURBS+Subdiv works in Blender yet, but with Maya it's definitely pro-tier. My guess is Ton & crew are very quickly catching up in basically every area where they may lag behind the big boys. After all, the Blender Foundation is now literally raking in millions a year, as many major studios have begun coming on board with the tool since the very innovative v2.8 dropped. In fact, a number of them are literally sending engineers to work right alongside the Blender devs to help integrate features they deem essential. True 'skin in the game' move. It's been a long 25+ year haul for Ton lol, but he's finally become an 'overnight' success at last. Blender is amazing now, and I definitely mean to dig deeply into it this year and start creating a virtual waifu for our MRS simulator, so we can finally use it with at least a virtual waifu's face in it. >>1814 >I think the fundamentals can be learned with many programs, then transferred to many of them. Yep, I think you're right.
Still trying out different stuff (pics related). Often not with some specific goal in mind, just for testing. However, combining different 3D parts in one assembly turned out to be more difficult than anticipated. I might have to learn a bit more to get better at it. For now I'm focusing on playing and low-hanging fruit (crude prototypes). >>8079 >devising a robowaifu design system based on a SBC system like the RPi This works to some extent, but it won't work for 3D sculpting. Also, I'm only doing this because I can't get myself to install and configure a new Linux on my laptop. Sooner or later I'm going to do that, though. I also intend to buy a PC for machine learning and some other things like 3D modelling, after I've moved to another place. If I tried this now, I couldn't be sure when the parts would arrive, and I'm busy anyways.
>>8110 I like the details of these cutaways Anon. Nice work. Seems to me you're really starting to get the hang of designing useful parts. Keep it up!
Open file (22.13 KB 150x150 openscad.png)
Open file (875.08 KB 1300x1089 libtest.png)
Has anyone here tried OpenSCAD yet? An anon mentioned it once here >>4554 but afaict there weren't any responses about it yet. It seems promising on the surface of it. It supports programmability for creating models, which as a programmer I find quite interesting. There is also a library available for it (presumably a library of modeled parts, etc., pic #2 related). The source code is available for building it directly on your machine. https://github.com/openscad/openscad https://www.openscad.org/cheatsheet/index.html https://github.com/nophead/NopSCADlib
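Since the programmability angle is the interesting part: OpenSCAD models are plain text, so you can also generate them from another program. A sketch of that workflow, emitting `.scad` source for a hypothetical parametric servo-horn disc with a bolt circle (all dimensions made up; `cylinder`, `difference`, `rotate`, and `translate` are real OpenSCAD primitives):

```python
def bolt_circle_disc(radius=20, thickness=4, bolt_r=1.6, n_bolts=6, circle_r=14):
    """Emit OpenSCAD source for a disc with n_bolts holes on a bolt circle.
    difference() subtracts each rotated hole cylinder from the disc body;
    holes are made slightly taller than the disc so the boolean cuts cleanly."""
    holes = "\n".join(
        f"  rotate([0, 0, {i * 360 // n_bolts}]) "
        f"translate([{circle_r}, 0, -1]) cylinder(h={thickness + 2}, r={bolt_r});"
        for i in range(n_bolts)
    )
    return (
        "difference() {\n"
        f"  cylinder(h={thickness}, r={radius});\n"
        f"{holes}\n"
        "}\n"
    )

scad = bolt_circle_disc()
# save as disc.scad, then render from the shell with:
#   openscad -o disc.stl disc.scad
```

Changing one parameter regenerates the whole part, which is exactly the kind of thing the "train programs to create objects" idea in >>8469's direction would build on.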
>>8469 I've had the tab open for quite a while, but didn't try it out. Seems interesting, I might look into it this year at some point. It's probably going to come down to finding out which workflow works better, programming or working with the mouse. The problem is that these programs have no good interchangeable file formats. What might be interesting about OpenSCAD is that it might be possible to train programs to create objects, while comparing them to real objects with image recognition, or to use them in a physics simulation. Maybe we can use this at some point to have our waifus imagine objects and run simulations to figure out how to handle them, a bit like dreaming, or like planning and understanding how things work through imagination.

AI, chatbots, and waifus Robowaifu Technician 09/09/2019 (Mon) 06:16:01 No.22 [Reply] [Last]
What resources are there for decent chatbots? Obviously I doubt there's anything passing the Turing Test yet. Especially when it comes to lewd talking. How close do you think we are to getting a real-life Cortana? I know a lot of you guys focus on the physical part of robo-waifus, but do any of you have anything to share on the intelligence part of artificial intelligence?
287 posts and 112 images omitted.
>>8184 >For all I know, in USA patents can be granted for ideas, it is not necessary that they implement it first, and the patent isn't for the implementation but the idea Correct on every point. >NGO Not sure about calling them an NGO, but there are some watchdog organizations that track this kind of thing. The EFF is probably the most well-known one. But even before the Bolsheviks overthrew the US Administration, palm-greasing was obviously an important part of governmental processes in the US (and elsewhere) ofc. So, when Concerned Org A expresses concerns or even outrage at say, a patent for the bit overlay process of a mouse cursor in a GUI (an actual software patent) versus say, IBM who quietly siphons off oh, US$500k (back when that meant something) from their slush-funds into the coffers of the responsible authorities for granting the patents, well...
>>8186 Suddenly China copying and ripping off Western designs and inventions doesn't seem so bad.
hi im glad and support what your trying to do
>>8462 Hi there Anon. Feel free to introduce yourself in our Embassy thread if you'd care to. >>2823 >hi im glad and support what your trying to do Thanks, hopefully you mean that in deed and not just word. There are literally dozens of topics that could use more focus here. Look around and just start working on learning about one or two you find the most interesting/feasible.

Open file (259.83 KB 1024x576 2-9d2706640db78d5f.png)
Single board computers & micro-controllers Robowaifu Technician 09/09/2019 (Mon) 05:06:55 No.16 [Reply] [Last]
Robotic control and data systems can be run by very small and inexpensive computers today. Please post info on SBCs & micro-controllers.


59 posts and 32 images omitted.
Here's a comparison video between the Teensy, Arduino, and Arduino clones: https://youtu.be/75IvTqRwNsE - The Teensy has some advantages, but costs $20 and has a proprietary bootloader; it also seems to be a bit harder for beginners to learn. I bought cheap Arduino clones, but haven't tested them yet, for lack of time. So I don't know if they are really bad. I think for prototyping they will most likely be good enough. ESPs are also an option; they're cheap but harder to get into.
>>8165 I don't get the beef he had with cheap Arduino clones. Those are pretty much all I use, and I found most of my errors come from my breadboard rather than the quality of the chips. ESP8266s are good, especially the D1 Minis. They only have a few digital pins, but you can easily download their libraries in the Arduino IDE and they're as close to plug-and-play as it gets, especially with the Blynk app that sets up the wifi for you. Though I want to get off proprietary stuff eventually, it's still a good stepping stone. I've tried ESP32s, but the only reason to use them is that they're fast enough to stream video, thus most ESP combo boards are camera boards; I have a couple of ESP32-Cams heading my way. The absolute cheapest though is the STM32. With the ESP32 you had to hold the reset button while uploading; with this one you have to move a jumper from run mode to boot mode every time. The STM32 was impossible to get running until I realized I was using the Windows Store version of the Arduino IDE instead of the simple download version. Remember the electrical engineer back on 8chan who taught lessons on AVR using Debian and register notation, even when what we were using already had ready libraries in the Arduino IDE? Well crank that up a notch and you get literally multiplexed registers, which only someone who has slept on the STM32 datasheet can begin to comprehend. Thankfully there are already libraries for the common use cases. In summary, these are the use cases I'd justify for each:
*Arduino clone - if you just want to flash an LED or drive a DC motor
*STM32 - if you want to drive a 1.8" display or have tons of interrupts (e.g. reading a radio control receiver)
*ESP8266 - if you want those app-controlled Internet-of-Things setups (there is a noticeable delay, though, between touching the virtual button and the LED lighting up)
*ESP32 - similar to the ESP8266, but the justification is that it's faster, so better with cameras; just get them combined with camera modules, they're only 5-10 bucks. The closest I can get to DIY wireless FPV without shelling out for those sketchy 5 GHz Chinese things.
Oh yeah, I have a Raspberry Pi and a Jetson Nano, but I consider them small-form-factor desktop computers rather than something I'd embed and forget.
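For anons wondering what "multiplexed registers" means in practice: STM32 GPIO ports pack a 2-bit mode field per pin into one 32-bit register (GPIOx_MODER in the reference manual), so configuring one pin is mask-and-shift arithmetic. The sketch below shows just the bit math in Python; on the real chip this would be C writing to a memory-mapped address, and the reset value used in the example is illustrative (it varies by port and chip family).

```python
# STM32-style GPIO mode register: 2 bits per pin, 16 pins per 32-bit port.
# Mode field values: 0b00 input, 0b01 output, 0b10 alternate fn, 0b11 analog.
MODE_INPUT, MODE_OUTPUT, MODE_ALT, MODE_ANALOG = 0b00, 0b01, 0b10, 0b11

def set_pin_mode(moder, pin, mode):
    """Return the register value with `pin`'s 2-bit field set to `mode`,
    leaving every other pin's field untouched."""
    shift = pin * 2
    moder &= ~(0b11 << shift) & 0xFFFFFFFF   # clear this pin's 2-bit field
    moder |= (mode & 0b11) << shift          # write the new mode into it
    return moder

# e.g. make pin 5 an output on a port whose register reads all-analog
reg = set_pin_mode(0xFFFFFFFF, 5, MODE_OUTPUT)
```

The "libraries for common use cases" the post mentions are doing exactly this under the hood, which is why reading the datasheet once demystifies them.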
>>8366 It would sure be nice if you taught everyone else here how to do this kind of stuff. Ever consider a /robowaifu/ teaching thread on these topics Anon?
Found out on nano/g about the Raspberry Pi Foundation's new Arduino competitor, the Pi Pico. >"$4 Raspberry Pi Pico board features RP2040 dual-core Cortex-M0+ MCU" https://web.archive.org/web/20210121070023/https://www.cnx-software.com/2021/01/21/raspberry-pi-pico-board-features-rp2040-dual-core-cortex-m0-mcu/

Open file (12.18 KB 480x360 0.jpg)
Robot Voices Robowaifu Technician 09/12/2019 (Thu) 03:09:38 No.156 [Reply]
What are the best-sounding female robotic voices available? I want something that doesn't kill the boner. I think Siri has an okay voice, but I'm not sure if that would be available for my project.

25 posts and 4 images omitted.
Well, do you want something that sounds like an ordinary woman, no frills, no bells and whistles? Or do you want one that sounds like a science fiction robot, with a reverb or "metallic" effect?
I mentioned in another thread about voice generation that I'm interested in trying to devise a way to automatically process audio clips and add sound effects like SHODAN. There was also brief discussion of using Terri Brosius' own voice to train a neural network, but it's not clear how practical that is. I don't know what your personal preferences are, but I do know that there are a number of people (including myself) who would love to hear SHODAN say dirty things. I'd like to get started working on this, but unfortunately other obligations are keeping me from investing too much time in it currently (work, school, family, etc.). Hopefully I can do more in-depth research on this idea and potentially have some sort of prototype working in the next few months, but I don't want to promise anything. I'd hate to get someone's hopes up and not deliver anything.
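One classic starting point for a SHODAN-ish metallic timbre is ring modulation: multiply the voice by a low-frequency sine carrier, which scatters the spectrum into inharmonic sidebands. A pure-Python sketch with made-up parameters (a real pipeline would run this over WAV sample arrays, e.g. with numpy, and layer in delays and pitch wobble on top):

```python
import math

def ring_modulate(samples, sample_rate=16000, carrier_hz=90.0, mix=0.8):
    """Multiply the signal by a sine carrier. `mix` blends the effect
    with the dry signal (0 = untouched voice, 1 = fully modulated)."""
    out = []
    for n, x in enumerate(samples):
        carrier = math.sin(2.0 * math.pi * carrier_hz * n / sample_rate)
        out.append((1.0 - mix) * x + mix * x * carrier)
    return out

# one second of a 440 Hz test tone standing in for a voice clip
tone = [math.sin(2 * math.pi * 440 * n / 16000) for n in range(16000)]
wet = ring_modulate(tone)
```

The carrier frequency and mix are the two knobs worth automating if you want a batch tool that processes clips the way the post describes.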
So 20 years or so ago I got an iMac and played with the text-to-speech and voice control systems that came with it. I figured out quickly the T2S was not very good. However, if the text was misspelled intentionally to emphasize vowels, like "mooz end skweer ahl", I could cobble together a bad accent. Then I edited the common error message base to reflect the accent ("Airor koad noombar..." etc.) so it would read out the number. Then I added a custom alert sound of a Russian lady cursing. So when the computer had an error, it would curse in a foreign language, then carefully sound out what the error was in a kind of "ESL".

This was a shitbox computer and it was frustrating to work with. I realized the voice control was only associating a sound with a command, so instead of a voice cue I used the sound of smacking it on the left side (for stress relief). Now, an iMac is built like a bongo drum, so there are many ways to hit the thing and get different noises. Flicking the front acted like alt-tab, rotating through windows; a gentle pat on the top put it into sleep mode. Right? Then voice commands like "fuck that!" (to close windows), "fuck all this" (close all windows), "more" for full screen, "shut up" for mute, "earmuffs!" to stop voice recognition, and a few more. The cache would bog down constantly and have to be purged; I made this a macro tied to the phrase "go clean yourself up!".

Now combine all this with a furious Jolt-cola-fueled workflow. The look on my roommates' faces when they watched me smack it upside the head and it curses at me (acknowledgment sound), I flick it a few times cycling through windows, say "fuck that", closing them out. Another smack (more cursing), search "for stuff" (slows to a crawl because of the cache). Get frustrated. Smack it again (more curses), "fuck all this" (all windows close). Smack (more curses), "go clean yourself up!" (runs an auto purge and defrag). Get up and say I'm done for now. Night night, and pat its head (gentle thumping sound initiating sleep mode). I basically turned an iMac into an abused, angry mail-order bride by dint of tinkering.
>>8423 Funny. I can just imagine your flatmates' faces.
>>8423 Wow, this is a great anecdote. You already took some steps towards having your own robowaifu back then.

CNC Machine Thread Robowaifu Technician 05/12/2020 (Tue) 02:13:40 No.2991 [Reply]
Many of the parts needed to build our robowaifus will need to be custom made, and they will need to be metal. For parts that have a high tolerance for imperfections, a 3D printer can print a mold and then a small-scale foundry can be used to cast the piece in metal (probably copper or aluminum). BUT there will be pieces that need a higher degree of precision (such as joints). For these pieces a CNC machine would be useful. CNC machines range widely in size, price, and accuracy, and I would like to find models suitable for our purposes. I know there are CNC machines available that can cut materials up to copper for under $300, but I don't know if that will be enough for our purposes. (https://www.sainsmart.com/products/sainsmart-genmitsu-cnc-router-pro-diy-kit?variant=15941458296905&currency=USD&utm_campaign=gs-2018-08-06&utm_source=google&utm_medium=smart_campaign) Some CNC machines can be used to engrave printed circuit boards, and that may prove useful for our purposes as well. Are there any anons that know more about CNC machines? Anons looking to buy one, ask your questions here.
12 posts and 1 image omitted.
>>2991 Here is a hobby CNC that can do aluminum and in theory up to 4mm steel (not recommended; convert a manual mill instead): https://www.cnc-step.com/cnc-router-1000x600-s-1000t-ballscrew/ Expect to pay an additional $1000 for tooling on top of the machine price itself. Here is a detailed review of the machine in German: https://www.precifast.de/mechanik-stabilitaet-und-genauigkeit-meiner-high-z-s-400t/ inb4 muh unsupported rails are shit: yes, unsupported rails are worse than profiled rails of the same size, but the unsupported rails on this machine are extra large to compensate for that. https://www.youtube.com/watch?v=JoI-nnipoxo https://www.youtube.com/watch?v=sISfwGV1VLU


>>4797 Thanks for all the video links Anon, they really help in finding the important things.
>>4797 Thanks. I saw several sources over the last few years claiming that at least wood can be handled with cheap hobbyist machines. I won't go there currently; no space, noise, other things to do, and no real use case for working with wood. But good to know that this exists, and that metal working might also be possible. >Tools for 1k Is this for metal, or the same for wood?
I have basically no knowledge on this topic, but this here strikes me as informative: https://youtu.be/EaGFQ7M04Wo and it's exactly about what I'm going to need at some point: cutting metal sheets of copper and aluminum for combining with plastics, to make parts stronger or run current through them. Conclusion: it's a build, not a machine to buy; he modded it with a new motor and some changes to the electronics. $400 and quite some work, limited usability, and professional software is necessary for better usability because of toolpath generation. Maybe for thin parts it's good enough; he didn't elaborate on that. His conclusion is that $1000 is necessary to get something useful. Sanladerer also built something which might be useful for my use case: https://youtu.be/ovrtvLFvFpk - I watched it a while ago, but don't remember the content. After all, this here >>3035 might still be valid: >where I live it's cheaper to prototype parts in plastic and order them in steel via CNC machining services
>>8422 On second thought, and after going through the comments: to me the video was interesting, but the headline is misleading. Also, he never says at what thickness of metal the machine failed. Others pointed out that a "single flute bit" would have been required; I don't know if that's true, and the same goes for the other advice.
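The "single flute bit" advice connects to basic feeds-and-speeds arithmetic: required feed rate = RPM x number of flutes x chip load per tooth. Hobby spindles only run fast, so with fewer flutes you can keep the chip load sane (avoiding rubbing and melted aluminum) at a feed rate a flimsy frame can actually sustain. The numbers below are generic textbook-style examples, not from the video.

```python
def feed_rate_mm_min(rpm, flutes, chip_load_mm):
    """Table feed needed so each cutting edge takes chip_load_mm per revolution."""
    return rpm * flutes * chip_load_mm

def chip_load_mm(rpm, flutes, feed_mm_min):
    """Actual chip per tooth at a given feed; too thin -> rubbing and heat
    instead of cutting, which is how hobby machines ruin aluminum."""
    return feed_mm_min / (rpm * flutes)

# Illustrative: 20000 RPM hobby spindle, ~0.02 mm/tooth target in aluminum.
one_flute = feed_rate_mm_min(20000, 1, 0.02)   # about 400 mm/min
two_flute = feed_rate_mm_min(20000, 2, 0.02)   # about 800 mm/min, often
                                               # beyond what a weak frame holds
```

So a single-flute cutter halves the feed the machine must hold for the same healthy chip load, and leaves more room for chip evacuation, which is plausibly what the commenters meant.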

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.



Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
30 posts and 18 images omitted.
>>4786 IMHO at least one relevant computer should be in the head, to imitate humans. Also, what we have to put into the head isn't only that; we'll need a lot of mechanisms there in general, so space matters. Think of facial expressions, microphones, speakers (maybe in the throat), heating for the skin, a tongue moving around while still leaving some space... cleaning mechanisms... Okay, this is going OT towards >>9 (face/head general). Maybe further discussion on what to put into the head is better there?
>>4792 Yup, all good points Anon.
Open file (43.59 KB 681x555 OAK-specs.jpeg)
Boards with cameras attached came up in the thread on SBCs, here: >>5705 OAK from OpenCV, and a cam from Jevois where the computer is part of the camera. Fascinating, but might be a problem if one wants to put it in eyeballs and also make those waterproof. OAK seems to be a bit big, and the cams from Jevois have air coolers... On the other hand, for development my concerns might be irrelevant, since one can build something with them and replace them later with something smaller and cooler. The Jevois camera has a shutter sensor with an inertial measurement unit and digital motion unit, a gyroscope, and all kinds of sensors, wow: https://youtu.be/MFGpN_Vp7mg
Here's a video on eye movement: https://youtu.be/FaC2RXBss2c The human eye has six muscles; it can even roll sideways a bit. However, what always bothered me is that so many fembot designs let each eye move up and down independently. I still think this isn't necessary. I'll look for a motor with two axes, for up-and-down movement.
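Supporting that point: converting a gaze target into motor commands only needs one pan angle per eye plus a single shared tilt angle, which is basic trigonometry. A sketch, with an assumed head-fixed coordinate frame (x right, y up, z forward, in meters):

```python
import math

def gaze_angles(x, y, z):
    """Target point in a head-fixed frame -> (pan, tilt) in degrees.
    Pan is the left/right rotation, tilt the up/down rotation; both
    eyes can share one tilt axis, since (ignoring close-range vergence
    effects) a target sits at the same elevation for both eyes."""
    pan = math.degrees(math.atan2(x, z))                      # left/right
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))      # up/down
    return pan, tilt

# straight ahead -> (0, 0); a target one meter ahead and one meter
# up needs roughly 45 degrees of tilt
```

Per-eye pan still matters for vergence (crossing the eyes on near objects), which is why independent horizontal actuation is worth keeping even if vertical isn't.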
>related xpost >>8659

Open file (50.45 KB 640x361 72254-1532336916.jpg)
Making money with AI and robowaifus Robowaifu Technician 11/30/2019 (Sat) 03:07:12 No.1642 [Reply]
The greatest challenge to building robowaifus is the sheer cost of building robots and training AI. We should start brainstorming ways we can leverage our abilities with AI to make money. Even training AI quickly requires expensive hardware and computer clusters. The faster we can increase our compute power, the more money we can make, and the quicker we can be on our way to building our robowaifus. Art Generation: Waifu Labs sells pillows and posters of the waifus it generates, although this has drawn concern and criticism because it sometimes generates copyrighted characters, since it doesn't check whether generated characters match its training data. https://waifulabs.com/ Deepart.io provides neural style transfer services. Users can pay for expedited service and high-resolution images. https://deepart.io/ PaintsChainer takes sketches and colours them automatically with some direction from the user; although it's not for profit, it could be turned into a business with premium services. https://paintschainer.preferred.tech/index_en.html I work as an artist and have dabbled with training my own AIs that can take a sketch and generate many different thumbnails that I've used to finish paintings. I've also created an AI that can generate random original thumbnails from a training set. In the future, when I have more compute power, my goal is to create an AI that does the mundane finishing touches to my work, which consume over 80% of my time painting. Applying AI to art will have huge potential in entertainment and marketing for animation, games, and virtual characters. Market Research


Edited last time by Chobitsu on 05/14/2020 (Thu) 01:15:03.
3 posts omitted.
Open file (152.51 KB 512x466 ThinkingTanya.jpg)
As a developer, it's become very clear to me that some things need to be closed source for the sake of making money to further develop robotics. What are some of your suggestions? Ideally the physical body should be relatively open source. But, arguably, having the software open source is far more important. Maybe open software plus a kit is the best way to go?
We need Tay.AI 2.0... remember /machinecult/ and Guacman?
>>1881 Yes, most of us here do. We're getting closer week by week Anon, and ours looks to be much closer to Tay herself, and quite unlike Guacman.
>>8269 >>8291 >>1642 I don't think selling kits will be a very good business model. There's a reason why it's Chinese companies doing this for printers: one needs cheap logistics, for example, and the margins will be rather low. The new businesses will rather be: - Shops which do the finishing of a fembot, or build them for local customers out of parts (which they might buy, or build entirely themselves). Especially the last paint jobs, nails, and such, making the skin look good by adding many thin layers. - Developing and building tools for such shops, or to replace them by selling more tools to enthusiasts and small clubs. Think of a box where a human-like bot can go in and get spray painted automatically, with slightly colored silicone skin added layer by layer. - Everything in the direction of OnlyFans, party support, and pimping.
>>8301 I'm one of the ones you linked to. My goal personally isn't to maximize my profits, hardly even to optimize them. In fact, I'd give this stuff away for free if I were Bill Gates rich. My choice of kits is simply b/c that seems to me the cheapest way for the typical Anon (or even the average normalcattle joesixpack) to get their own robowaifu. I'm sure plenty of men will step up to make money hand over fist in the robowaifu industries once they're establishing themselves well. Right now we are trailblazing a frontier, so the dynamics are rather different atm. We aren't even at the Ford Quadricycle >>7693 stage of robowaifu development yet, IMO.

Speech Synthesis general Robowaifu Technician 09/13/2019 (Fri) 11:25:07 No.199 [Reply] [Last]
We want our robowaifus to speak to us, right?



The Tacotron project:


No code available yet, hopefully they will release it.

183 posts and 96 images omitted.
>>7594 What drama happened besides the migration? I've been too deep in my projects to browse like I used to.
Open file (40.97 KB example.mp3)
Open file (17.84 KB 863x454 example_eq.png)
Open file (18.51 KB 562x411 example_parameters.png)
I'll post this here for now, since it's definitely relevant. I was experimenting a little bit more with Deltavox RS and Audacity. It seems that there is no "one size fits all" solution when using Deltavox. In order to get a decent result, you have to experiment with different spellings, phonemes, energy, F0, bidirectional padding, and so on. In Audacity, I used a simple filter curve. I was able to get noticeably less tinny audio, which sounds less computer generated. I'm going to explore more options for editing the audio after it's been synthesized to improve its quality. I'll post again if I find anything interesting. I'll repost the links here since they're still relevant: Deltavox User Guide https://docs.google.com/document/d/1z9V4cDvatcA0gYcDacL5Bg-9nwdyV1vD5nsByL_a1wk/edit Download: https://mega.nz/file/CMBkzTpb#LDjrwHbK0YiKTz0YllofVuWg-De9wrmzXVwIn0EBiII
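The "tinny" quality is mostly excess high-frequency energy, so even a one-pole low-pass filter (roughly what a gentle downward filter curve does at the top end) tames it. A pure-Python sketch with assumed parameters; a real workflow would apply this (or Audacity's filter curve, as in the post) to the WAV samples Deltavox exports:

```python
import math

def one_pole_lowpass(samples, sample_rate=22050, cutoff_hz=4000.0):
    """Classic RC smoothing: y[n] = y[n-1] + a * (x[n] - y[n-1]),
    with the coefficient a derived from the RC time constant.
    Attenuates content above cutoff_hz, softening a tinny result."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    a = dt / (rc + dt)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

# DC (a constant) passes through unchanged, while an alternating
# +1/-1 "buzz" at the Nyquist rate is noticeably attenuated
smooth = one_pole_lowpass([1.0] * 200)
buzz = one_pole_lowpass([1.0, -1.0] * 100)
```

A single pole only rolls off at 6 dB/octave, so for surgical EQ you'd cascade stages or use a proper biquad, but it shows why post-filtering helps at all.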
>>8150 Thanks. BTW, do you know if this is open source? Since QT dlls are included I presume this is C++ software. If both are true, then it's very likely I can rewrite this to be portable across platforms -- not just (((Wangblows))) and we can be running it on our RaspberryPis & other potatos. Thanks for all the great information Anon.
>>8244 And one based on Vocaloid: https://youtu.be/OPBba9ScdjU
