/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



/robowaifu/ Embassy Thread Chobitsu Board owner 05/08/2020 (Fri) 22:48:24 No.2823 [Reply] [Last]
This is the /robowaifu/ embassy thread. It's a place where Anons from all other communities can congregate and talk with us about their groups & interests, and also network with each other and with us. ITT we're all united together under the common banner of building robowaifus as we so desire. Welcome. Since this is the ambassadorial thread of /robowaifu/, we're curious what other communities who know of us are up to. So w/o doxxing yourselves, if this is your first time posting ITT, please tell us about your home communities if you wouldn't mind, Anons. What do you like about them, etc.? What brought you to /robowaifu/ today? The point here is to create a connection and find common ground for outreach with each other's communities. Also, if you have any questions or concerns for us, please feel free to share them here as well.
Edited last time by Chobitsu on 05/23/2020 (Sat) 23:13:16.
105 posts and 29 images omitted.
>>5211 AI and Chatbots >>22

Welcome to /robowaifu/ Anonymous 09/09/2019 (Mon) 00:33:54 No.3 [Reply]
Why Robowaifu? Most of the world's modern women have failed their men and their societies, feminism is rampant, and men around the world have been looking for a solution. History shows there are cultural and political solutions to this problem, but we believe that technology is the best way forward at present – specifically the technology of robotics. We are technologists, dreamers, hobbyists and geeks looking forward to a day when any man can build the companionship he desires in his own home. Not content to wait for the future, however, we are bringing that day forward. We are creating an active hobbyist scene of builders, programmers, artists and designers, using the technology of today, not tomorrow. Join us!
NOTES & FRIENDS
> Notes:
-This is generally a SFW board, given our primarily engineering focus. On-topic NSFW content is OK, but please spoiler it.
-Our bunker is located at: https://anon.cafe/robowaifu/catalog.html Please make note of it.
> Friends:
-/clang/ - currently at TBA - toaster-love NSFW. Metal clanging noises in the night.
-/monster/ - currently at https://smuglo.li/monster/ - bizarre NSFW. Respect the robot.
-/tech/ - currently at >>>/tech/ - installing Gentoo Anon? They'll fix you up.
-/britfeel/ - currently at https://anon.cafe/britfeel/ - some good lads. Go share a pint!
-/server/ - currently at https://anon.cafe/server/ - multi-board board. Eclectic thing of beauty.
-/f/ - currently at https://anon.cafe/f/res/4.html#4 - doing flashtech old-school.
-/kind/ - currently at https://kind.moe/kind/ - be excellent to each other.


Edited last time by Chobitsu on 07/25/2020 (Sat) 19:02:11.

Robot waifu desires Robowaifu Technician 09/18/2019 (Wed) 11:30:37 No.408 [Reply]
What level of robot would you clang robowaifuists? (Thread for providing references and desires to those actively developing robot waifus that can be used sexually.)
11 posts and 7 images omitted.
>>451 Aegis-shaped toasters the best tbh.
I'm really not into /clang/, but I once had a nice picture of a robogirl, which I think I've lost, that was really great. Maybe one of you has a pic db where you can find her. She was very humanoid-looking; it was a CGI-rendered picture, not a drawing. Her outer shell was pink, but she had darker soft parts around the hips and in other places, like printed rubber parts.
Open file (25.95 KB 976x396 VRS.png)
I'm fine with 8. Besides, we're closing in on some of that Elon Musk Neuralink tech, so I think we'll be able to jump into VR worlds and really feel things sooner (relatively speaking) rather than later. Also, to keep from clogging up the embassy thread: >>5178 I'm not too concerned with deep conversations with an A.I.. I'm more concerned about how good it is at using environmental data as well as user input to augment its responses. That user data being: a user's facial expressions, their tone of voice, and their particular sentence structure and word use. Of course, getting an A.I. to recognize those variables and act on them appropriately is the real trick. I don't need it to understand exactly what I'm saying as long as it can get the gist of what I'm saying. You could maybe even have preset responses for the user to choose from, in a similar fashion to a Visual Novel, but it'd be more intuitive if she could understand your natural voice and spontaneous speech.
I'd call it a Variable Response System, or VRS. It could have a number of internal flags for the specific responses to be triggered. They don't have to be deep or insightful as long as they have the illusion of matching the feelings of the user, empathy-wise. For instance, a simple conversation starter might be "How was your day?" If you provide a jovial response with a warm expression that the bot can recognize, it will respond in kind. For a system like this, you wouldn't necessarily need any Vocaloid or text-to-speech nonsense if you're just wanting to have it show emotions. You could have a simple avatar that was capable of showing a range of emotions. For the "voice" you could have different tones that express certain feelings of speech, with faster high-pitched beeps to show happiness and excitement, or slower lower-pitched beeps to show boredom or a melancholy kind of emotion. The most intuitive option, I think, will probably be the simpler one.
Unless you're needing deep conversations for discussions of philosophy or niche subjects, there's no need to craft an overly complex dialogue tree or conversational A.I.. Not that it has to be bare-bones either, but I think I'd prefer a dumb bot that can pass for smart instead of a smart bot that trips over its own mental shoelaces and ends up sounding dumb, or goes on an inorganic and wayward rant that's more off-putting than insightful. tl;dr- I want a digital assistant with some personality that is warm towards me. Even if her expressions are all mechanically controlled with numbers, that's good enough to fool my monkey brain if she's behind a cute avatar.
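The VRS idea above can be sketched in a few lines: internal flags get set from crude cues in the user's word choice, and a preset response (plus a beep style) is picked to mirror the mood. Everything here is an illustrative placeholder, not a real design: the cue words, flag names, and canned responses are made up, and real facial-expression or tone-of-voice input would replace the keyword matching.

```python
# Minimal sketch of the flag-based "Variable Response System" (VRS):
# guess a mood flag from word use alone (no real NLP), then pick a
# preset reply and beep style matching that flag.

JOVIAL_CUES = {"great", "awesome", "fun", "wonderful", "good"}
GLUM_CUES = {"tired", "bad", "awful", "boring", "rough"}

RESPONSES = {
    "jovial": ("Glad to hear it!", "fast high-pitched beeps"),
    "glum": ("Sorry it was rough...", "slow low-pitched beeps"),
    "neutral": ("I see. Tell me more?", "steady mid-pitched beeps"),
}

def set_flags(utterance: str) -> str:
    """Set a mood flag from word choice alone (a stand-in for richer cues)."""
    words = set(utterance.lower().replace("!", "").replace(".", "").split())
    if words & JOVIAL_CUES:
        return "jovial"
    if words & GLUM_CUES:
        return "glum"
    return "neutral"

def respond(utterance: str) -> tuple[str, str]:
    """Return a (reply text, beep style) pair matching the detected mood."""
    return RESPONSES[set_flags(utterance)]
```

So `respond("My day was great!")` picks the jovial pair, while an unrecognized sentence falls through to the neutral flag — the "dumb bot that can pass for smart" approach, since nothing here understands the sentence at all.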
>>5206 Sorry, I don't think I can help you there. Sounds nice though, good luck finding it Anon. >>5212 >8 >"An AI is fine too."
Open file (9.88 KB 292x219 Smars.jpeg)
>>5212 AI and Chatbots still might be the right place for you to look to get started: >>22 Or pick any Arduino or other electronics projects; we have threads for this as well: >>95 Building a ground-dwelling mobile robot could be the right start for you. Just one example: https://www.thingiverse.com/thing:2662828 might also get you started with 3D printing. Every topic has its learning curve, though. Ask me, I'm trying everything at once

Open file (457.29 KB 1600x1067 yummy_sepplesberries.jpg)
Open file (15.07 KB 306x344 ISO_C++_Logo.png)
Haute Sepplesberry Cuisine TBH Robowaifu Technician 09/04/2020 (Fri) 20:10:50 No.4969 [Reply]
Good morning /robowaifu/. For today's cooking-lesson class, we'll be baking up some delicious Sepplesberry Pies. First we prepare some crispy and light Pi crusts and get them just right, then we'll load them up with tons and tonnes of succulent and Juci Sepplesberrys. We'll also mix in lots of other tasty goodness then pop them into the oven and after a couple hours, voilà! delightful Sepplesberry Pies. >tl;dr ITT we mek C++ dev boxes from RaspberryPi computers >C++ development main thread >>4895 Embedded processors and integrated systems programming naturally go hand-in-hand for /robowaifu/. The RaspberryPi and C++ are natural baseline choices for each of these categories. At this point in time they are both popular concerns with large communities behind them, and each bring objective benefits for us as robowaifu technicians. For the Pis they are quite powerful relatively speaking, and inexpensive as well. For C++ it has great performance and other characteristics when used correctly, with generic abstraction mechanisms second to none. In an attempt to dovetail the two areas we're going to be going through setting up Raspberry Pis as little computers for learning the C++ programming language on. This should help every anon on /robowaifu/ that follows along to be on the same basic page for both embedded and programming. Once we're finished each of you will have your own little development exploration box you can literally carry around in your pocket. It will be self-contained, independent, and won't interfere with your other computing/vidya platforms. It will offer you a convenient way to begin controlling embedded hardware directly on the same machine that you write software for it on. This is a pretty compelling scenario IMO, and should serve us all as a good base from which we can branch out and grow from there. 
Working with other hardware and software will flow naturally from this project, and will give each of us a common experience from which we can build together and keep moving forward. So let's get started /robowaifu/.
12 posts and 28 images omitted.
>>5165 As mentioned, we'll use dd to write the downloaded image out to the MicroSD disk. https://tldp.org/LDP/abs/html/extmisc.html I'm going to combine unzip and write in one Bash statement, so in my specific case, the command is: unzip -p 2020-08-20-raspios-buster-armhf-full.zip | sudo dd of=/dev/mmcblk0 bs=4M conv=fsync > #1 In a few minutes the image should be written out to the device. > #2 Checking again with GParted, we see we now have a fat32 /boot partition, and an ext4 /root partition. > #3 Because of a setting with the bootloader (in regards to the next step), let's go ahead and move root/ to the end of the drive space. This will take several minutes to finish. > #4 There is an unallocated block of 668MB, so we may as well go ahead and create a swap file there. > #5
Open file (1.20 MB 2560x1600 20200915_031829.jpg)
Open file (798.42 KB 2560x1600 20200915_044816.jpg)
>>5166 Turn swapon. > #1 Unmount the partitions, and eject the MicroSD card from your main box, then insert it into the RPi hardware drive slot (from the bottom side of the board). > #2 Connect the Ethernet cable, HDMI video cable, mouse+keyboard dongle, and then power it up. Once it boots up, click the Cancel button on the initial 'Welcome to Raspberry Pi' dialog that first pops up. We'll deal with updates a little later. > #3
>>5167 Alright that's it for today class. If you've followed along successfully, congrats. You've begun a journey into embedded computing. :^) We'll continue on with the RPi preliminaries soon. As always, if you get stuck or have any questions just ask ITT. Cheers
Open file (71.94 KB 407x481 Workspace 1_069.png)
Open file (53.34 KB 627x469 Workspace 1_070.png)
Open file (133.35 KB 1366x768 Workspace 1_044.png)
Alright, there are a couple of things to get set up on the main box as well so we can work with our SepplesberryPi w/o having to use either a monitor, keyboard, or mouse connected to it. We'll connect it directly to our main box with an Ethernet cable for its networking. IIRC, I think W*ndows calls this Internet Connection Sharing, but ofc we'll be using Linux instead. So, I use the WiFi for my main box's Internet access. This leaves the wired Ethernet port available. And it's this port we'll set up for the RPi to get out to the network with. You configure the settings on your main box to bridge the two connections (and this will also provide DHCP settings for any incoming clients connecting through the Ethernet port). For the RPi, you don't have to do anything special, it just werks. BTW, YMMV. This setup is workable for my specific situation. You may want to connect yours straight into your router, or across the WiFi, etc. Just make sure you can figure out what its IP address is afterwards b/c you'll need it for the other step. Then, we'll use VNC to control our RPi remotely. The RPi has the VNC server software already installed, but we'll need to install the viewer software on our main box. Once everything's set up, you'll no longer need to use those other peripherals to work on your RPi dev box. Go ahead and plug the two boxes together with the Ethernet cable in between. On your main machine, choose 'Advanced Network Configuration'. > #1 Select the Ethernet connection, then click the little gear for 'Edit the selected connection'. > #2 Choose IPv4 Settings > Method > Shared to other computers then save.


>>5214 On the RPi, we see scrot is already installed by default, and that we have only 408 MB free on the drive. We'll free some space up before we're done. Let's go ahead and set up VNC then we'll no longer need the physical keyboard+mouse, nor the HDMI monitor attached to our RPi hardware. Select [RPi Menu] > Preferences > Raspberry Pi Configuration > #1 Enable SSH & VNC. > #2 Now check your IP address under the VNC Server applet, you'll need this address for the viewer on your main box. > #3

Robowaifu Design Software Robowaifu Technician 09/18/2019 (Wed) 11:41:06 No.414 [Reply]
In this thread we discuss various CAD software for the purpose of making access easier for others to get started. I'm using Fusion360. Other good options are Blender and FreeCAD. Fusion360 is the easiest to use imho and is free unless your business makes over 100,000 dollars.

Post software, tutorials, tips and tricks you've learned, etc. Let's make designing waifus easier for each other!
32 posts and 13 images omitted.
>>4795 Hope you keep us updated Anon. >do these work OK on a SBC like that?
Open file (1.37 MB 500x250 migi.gif)
>>4802 I'm using Meshlab for repairs and filters, Wings3D for some extrusions and creating new simple mesh models (though in the future I might use Solvespace for that), and Prusa Slicer for cutting mesh models and adding parts for support (I don't mean the regular auto-generated supports, but manually added ones). I tried Blender briefly, though it's really slow with a complex mesh model on a Raspi3. I didn't even try to use MakeHuman yet. Misfit Model 3D works well for some things; it can open dxf files but not stl, while Wings3D accepts stl but not dxf, and Solvespace takes only dxf or its own file format. Meshlab is very responsive and has a lot of filters, but the file I repaired with it still had errors in Prusa Slicer and caused a faulty print, while it didn't make it easier to cut it. Cutting whole models into parts in Prusa Slicer causes severe errors (deformations) in that model most of the time, which then need to be repaired manually, taking hours (for a beginner)... By now I'm completely opposed to the workflow of mixing CAD-designed exact parts with mesh models, like the developer of Sophie does, but also the developer of InMoov seems to have done. This kind of mixing needs to be kept to a minimum for good design.
Open file (53.69 KB 683x535 Fusion-Upgrade.png)
Open file (577.29 KB 2400x1600 Stallman.jpeg)
Before I forget it again: Congratulations on your progress in learning Fusion360, especially the more advanced options. Autodesk thanks you a lot for enjoying their product. Autodesk is so happy for everyone, they just announced that Fusion360 won't have so many options in the free version anymore. Now you can support them with a regular monthly payment or once a year, and therefore contribute to the further development of their product, and so help them to create even better and more expensive options. You can even get some bonuses here and there, maybe. So you won't need to spend too much. Hmmm.. Okay?
>>5194 kek. Yeah Autodesk is rather mercenary in their general behavior. I've been dealing with them for years, no surprise tbh.
>>5194 Don't forget to export your files in every format they allow right now; you might need these files later. Not sure if dxf works everywhere, and how well. Every program has its own format, and it seems to be difficult to write code for import.

Hand Development Robowaifu Technician 07/28/2020 (Tue) 04:43:19 No.4577 [Reply]
Since we have no thread for hands, I'm now opening one. Aside from the AI, it might be the most difficult thing to achieve. For now, we could at least collect and discuss some ideas about it. There's Will Cogley's channel: https://www.youtube.com/c/WillCogley - he's on his way to building a motor-driven biomimetic hand. It's for humans eventually, so not much space for sensors right now, which can't be wired to humans anyways. He knows a lot about hands and we might be able to learn from it, and build something (even much smaller) for our waifus. Redesign: https://youtu.be/-zqZ-izx-7w More: https://youtu.be/3pmj-ESVuoU Finger prototype: https://youtu.be/MxbX9iKGd6w CMC joint: https://youtu.be/DqGq5mnd_n4 I think the thread about sensoric skin >>242 is closely related to this topic, because it will be difficult to build a hand which also has good sensory input. We'll have to come up with some very small GelSight-like sensors. F3 hand (pneumatic) https://youtu.be/JPTnVLJH4SY https://youtu.be/j_8Pvzj-HdQ Festo hand (pneumatic) https://youtu.be/5e0F14IRxVc Thread >>417 is about Prosthetics, especially Open Prosthetics. This can be relevant to some degree. However, the constraints are different. We might have more space in the forearms, but we want marvelous sensors in the hands and have to connect them to the body.


33 posts and 9 images omitted.
>>5045 > Cost estimate No, or I haven't seen it yet. Costs might not be the problem, if it's already finished. But for development it might be relevant, especially considering possible additional shipping costs and the delay.
>>5043 >Also, with some little changes it might not be necessary, and resin printers might be good enough, or even FDM and standard parts which we can buy. Sounds interesting. Have anything specific in mind, Anon?
>>5047 No, or I don't remember something specific, but we can print small with FDM: https://youtu.be/gN7QMhBzd4E https://youtu.be/LHg9phNSCEY
>>5048 >0.15mm wow, I didn't know there were even nozzles that small; it looks almost like resin printing. Looks like it takes some mods to make it work.

AI + Brain/Computer Interface news & commentary Robowaifu Technician 09/15/2019 (Sun) 10:35:53 No.253 [Reply]
DARPA Wants Brain Implants That Record From 1 Million Neurons

spectrum.ieee.org/the-human-os/biomedical/devices/darpa-wants-brain-implants-that-record-from-1-million-neurons
16 posts and 4 images omitted.
>>5184 Sounds to me like fantasizing about what could be one day, because it's easier than trying to build stuff with what we have and can access right now. Why not watch some Isaac Arthur videos and speculate about future sky palaces on Venus, where we can live with our waifus and form a new civilization... I'm speaking from experience: dreaming is very nice, but getting things done is more important.
Open file (806.47 KB 918x710 aerith_face.png)
>>5185 I agree that practical advances are important. That's why I'm building one, after all. But just look at the quality of 3D animation nowadays. Holy fuckin' shit how far it's come! It's already better than reality if you ask me! For example, Aerith or Tifa from the recent FF7 remake? Combine a 3d animated model of this quality with even half-decent A.I. and that's waifu for laifu material right there!
>>5199 I guess what I'm thinking of would be a souped-up version of that PS4 VR game 'Summer Lesson', but maybe more Augmented Reality with IoT integrated so she appears to be cooking or cleaning when in fact it's the appliances running IRL and the animations are synced over the top of that. Sure, you'd still have to provide the correct ingredients for cooking in barcoded containers and go to pre-designated interaction points in your room/house but I think that would still be pretty awesome. Especially if you could take her outside with you on your mobile phone, too.
>>5200 >the appliances running IRL and the animations are synced over the top of that. I'm not too sure if the Visual Waifu thread got all the old archives synced across or not, but iirc this notion was discussed a bit there. Could be related, so might be of some interest. >>240
>>5185 >Dreaming is very nice, but getting things done is more important. I think we're all trying to create 'practical' things here Anon, even if they're virtual. Otherwise, we'd all be hanging out on /clang/ wherever that might be today, rip.

Visual Waifus Robowaifu Technician 09/15/2019 (Sun) 06:40:42 No.240 [Reply] [Last]
Thoughts on waifus which remain 2D but have their own dedicated hardware. This is more on the artistry side though ai is still involved. An example of an actual waifu product being the Gatebox.
gatebox.ai/sp/

My favorite example is Ritsu; she's a cute AI from Assassination Classroom whose body is a giant screen on wheels.
133 posts and 71 images omitted.
>>4232 I like the sound of those ideas Anon. Can you expand them with some specific details for us?
It's Chinese software, but the movements - the mocap and the cloth physics - are so fluid: https://www.youtube.com/c/LumiN0vaDesktop/videos Seems to be in beta, as it's just prerendered sequences, no contextual interactivity yet.
>>4235 AIML or other chat systems store sentences and logic for when to use them. Those are called by some software (a runtime?). One could write software which would create responses on its own, using other software like NLP, GPT, ... There would be more time to analyze the grammar and logic, compared to doing that only when needed. Humans also think about what they would say in certain situations ahead of time, have inner monologues, etc.
>>4829 I think the idea of 'pre-rendering' responses (so to speak) might have some strong merit. Particularly if we could isolate common channels most robowaifus would go down in everyday scenarios, then there might be some efficiencies in runtime performance to be gained there.
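The 'pre-rendering' idea above can be sketched as a simple cache: run the expensive language pipeline offline over anticipated prompts, store the results, and serve cheap lookups at runtime, falling back to live generation only on a miss. The generator below is a trivial stub standing in for a real NLP/GPT pipeline, and the normalization step is a deliberately naive illustration.

```python
# Sketch of pre-rendered responses: precompute replies for common prompts
# offline, then answer at runtime with a dictionary lookup.

def slow_generate(prompt: str) -> str:
    """Placeholder for an expensive model call done ahead of time."""
    return f"[considered reply to: {prompt}]"

def normalize(prompt: str) -> str:
    """Collapse surface variation so near-identical prompts share a slot."""
    return " ".join(prompt.lower().strip("?!. ").split())

def prerender(prompts: list[str]) -> dict[str, str]:
    """Offline pass: build the response table before runtime."""
    return {normalize(p): slow_generate(normalize(p)) for p in prompts}

def respond(cache: dict[str, str], prompt: str) -> str:
    """Runtime pass: cheap lookup, paying the generation cost only on a miss."""
    key = normalize(prompt)
    if key not in cache:
        cache[key] = slow_generate(key)  # cache miss: generate once, keep it
    return cache[key]
```

The runtime-performance gain comes from the common everyday channels ("How was your day?", mealtime talk, greetings) hitting the precomputed table, while rarer prompts still work, just slower the first time.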
>>4028 >best Gravity Falls episode kek

Work on my Elfdroid Sophie Robowaifu Enthusiast 08/18/2020 (Tue) 22:49:13 No.4787 [Reply]
Design and 3D printing are currently underway to turn Sophie from an articulated doll into a proper robowaifu. I will post updates to design files on her Google Drive folder when I have confirmed that everything actually works smoothly. So far I've just got her eyes moving left and right, her lower jaw can open and close, and I am working on giving her neck two degrees of freedom.
27 posts and 7 images omitted.
>>5175 Thank you. > Long printing time So far I only print scaled-down models, and even then sometimes with a somewhat higher layer height. I'm trying to redesign some of the models to make them easier to print. I'm glad you're doing what you're doing, but I wish more people would look into it and work on the same files. Or work on other ones and come up with a good workflow. Are we the only ones with a printer here? I seem to be the only other one printing your files.
>>5179 Glad to hear you are also having a go. TBH when I upload parts to Thingiverse or MyMiniFactory, they usually end up being used as some kind of articulated GoPro camera mount or borrowed for some drama project completely unrelated to Robowaifus. But then, people who click on the links to the STLs will still see where they originated, thus increasing exposure of people to the idea of the 3D printed robowaifu.
Open file (807.93 KB 2354x2110 IMG_20200917_193931~2.jpg)
>>5175 >0.15 mm layer height Why?!? In all your pics you were putting something on her face anyways. Post-processing might be the better way anyways: >>5137
>>5186 In Cura 0.15mm layer height is classed as "normal". Also, I really dislike the ridges that come with coarse settings. I found that they tend to show through the paint, especially in certain lighting situations. I still have cleaning up to do (particularly her forehead seam, which I'm not happy with yet) but I wanted to work out more electronics and do the meat and potatoes of getting my servos working first. The first set of servos I had were for a Tinkerkit Braccio arm, and Arduino seems to have programmed them so that they are only compatible with the Braccio Shield, not interchangeable with other devices. This meant that they would always jerk violently to a "safety position" whenever I powered them on, so as you can imagine programming was a nightmare. Sophie's head used to fall apart very frequently. To overcome this unexpected flaw, I purchased some generic 25kg.cm servos, an Elegoo MEGA2560 and a Pololu Maestro servo control board. Now I am back in business.
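Driving generic servos from a Pololu Maestro board, as described above, comes down to sending its "Set Target" command over serial: command byte 0x84, the channel number, then the target pulse width (in quarter-microsecond units) split into two 7-bit bytes, low bits first. A minimal sketch of building that command follows; the serial port path and the pyserial usage in the comment are assumptions, not something from the post.

```python
# Encode a Set Target command for the Pololu Maestro's compact serial
# protocol: 0x84, channel, target low 7 bits, target high 7 bits,
# where the target is measured in 0.25 µs units.

def maestro_set_target(channel: int, target_us: float) -> bytes:
    """Build the Set Target command; target_us is the servo pulse width in µs."""
    quarter_us = int(round(target_us * 4))  # Maestro units are quarter-µs
    return bytes([0x84, channel, quarter_us & 0x7F, (quarter_us >> 7) & 0x7F])

# To actually drive a servo (untested sketch; pyserial and the port path
# /dev/ttyACM0 are assumptions for a Linux box):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 9600) as port:
#       port.write(maestro_set_target(0, 1500))  # channel 0 to ~center
```

Because the board handles the pulse timing itself, the host never has to bit-bang PWM, which is part of why a dedicated servo controller avoids the power-on jerk problems described with the Braccio servos.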
>>5197 Not that anon, but that sounds great! Thanks for the updates.

ROBOWAIFU U Robowaifu Technician 09/15/2019 (Sun) 05:52:02 No.235 [Reply] [Last]
In this thread post links to books, videos, MOOCs, tutorials, forums, and general learning resources about creating robots (particularly humanoid robots), writing AI or other robotics related software, design or art software, electronics, makerspace training stuff or just about anything that's specifically an educational resource and also useful for anons learning how to build their own robowaifus. >tl;dr ITT we mek /robowaifu/ school.
Edited last time by Chobitsu on 05/11/2020 (Mon) 21:31:04.
60 posts and 34 images omitted.
Here is a learning plan for getting into Deep Learning, which got some appreciation on Reddit: https://github.com/Emmanuel1118/Learn-ML-Basics - it also includes info about which math basics are needed first. However, I also want to point out that DL seems not to be the best option in every case. Other ML approaches like Boosting might be better for our use cases: https://youtu.be/MIPkK5ZAsms
Open file (229.05 KB 500x572 LeonardoDrawing.jpg)
'''Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs''' According to Alexander Stepanov (in the foreword to The Boost Graph Library, 2001), John Backus and this Turing Award lecture paper were inspirational to the design of the STL for C++. The STL underpins the current state of the art in generic programming, which must be both highly expressive+composable and also perform very fast. Therefore, indirectly, so does John Backus's FP system, and for that we can be grateful.
Open file (24.66 KB 192x358 Backus.jpg)
>>5191 Backus also invented FORTRAN (back when that was a first of its kind for programming portability), and is one of the smartest men ever in the entire history of computing. https://ethw.org/John_Backus
