/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

LynxChan updated to 2.5.7, let me know whether there are any issues (admin at j dot w).


Reports of my death have been greatly exaggerated.

Still trying to get done with some IRL work, but should be able to update some stuff soon.

#WEALWAYSWIN


Welcome to /robowaifu/, the exotic AI tavern where intrepid adventurers gather to swap loot & old war stories...


RoboWaifuBanners Robowaifu Technician 09/15/2019 (Sun) 10:29:19 No.252 [Reply] [Last]
This thread is for the sharing of robowaifu banners. As per the rules, follow these requirements:

>banner requirements:

File size must be lower than 500 KB and dimensions are 300x100 exactly.

Allowed file formats are .jpg, .png, and .gif.
97 posts and 83 images omitted.
>>10313 >Adepti Why TH can't I delete my posts?
>>6692 Hi, just ran into an issue with the first banner here. Please think of dark mode in browsers and check your banners for it. The text should have a white outline around it. In normal mode it wouldn't be visible, but it's necessary to outline the text in dark mode. That banner seems to be the only one with that problem.
>>12590 Fair point Anon, my apologies. However, this is the sole banner that made the cut from OP's original posts back in the day, so I couldn't really bring myself to 'deface' it. >tl;dr Sorry, it's /robowaifu/'s very first banner and something of a historical artifact for us. OTOH, if some anon decided he wanted to fill the alpha channel of the image with a nice pastel complement of Gumi's hair then I'd be happy to add that one into rotation?
>>12594 I was thinking more of a shadow / outline along the text. It doesn't matter so much to me, I only wanted to point it out bc it's suboptimal.
>>12603 Yes, I'd agree. Maybe a fuzzy-select in Photoshop or something and a fill with a radial white-to-pastel-complement-of-Gumi's-hair-color outline? If you do something along that line, just post it here and I'll have a look!

ROBOWAIFU U Robowaifu Technician 09/15/2019 (Sun) 05:52:02 No.235 [Reply] [Last]
In this thread post links to books, videos, MOOCs, tutorials, forums, and general learning resources about creating robots (particularly humanoid robots), writing AI or other robotics related software, design or art software, electronics, makerspace training stuff or just about anything that's specifically an educational resource and also useful for anons learning how to build their own robowaifus. >tl;dr ITT we mek /robowaifu/ school.
Edited last time by Chobitsu on 05/11/2020 (Mon) 21:31:04.
92 posts and 61 images omitted.
>'''"Mathematics for Machine Learning - Why to Learn & What are the Best Free Resources?"'''
Talented and very smart technical animator. Science & Technology topics. https://www.youtube.com/c/ThomasSchwenke-knowledge/playlists
>>4660 >related crosspost (>>11211)
DeepMind YT playlists https://www.youtube.com/c/DeepMind/playlists This anon recommended it (>>11555). I'm currently working through the 8-video Deep Learning Introduction list.
Computer Systems: A Programmer's Perspective, 3/E (CS:APP3e) Randal E. Bryant and David R. O'Hallaron, Carnegie Mellon University
>Memory Systems
>"Computer architecture courses spend considerable time describing the nuances of designing a high-performance memory system. They discuss such choices as write-through vs. write-back, direct-mapped vs. set-associative, cache sizing, indexing, etc. The presentation assumes that the designer has no control over the programs that are run, and so the only choice is to try to match the memory system to the needs of a set of benchmark programs.
>"For most people, the situation is just the opposite. Programmers have no control over their machine's memory organization, but they can rewrite their programs to greatly improve performance. Consider the following two functions to copy a 2048 x 2048 integer array:

void copyij(long int src[2048][2048], long int dst[2048][2048])
{
    long int i, j;
    for (i = 0; i < 2048; i++)
        for (j = 0; j < 2048; j++)
            dst[i][j] = src[i][j];
}

void copyji(long int src[2048][2048], long int dst[2048][2048])
{
    long int i, j;
    for (j = 0; j < 2048; j++)

Message too long. Click here to view full text.

Edited last time by Chobitsu on 08/26/2021 (Thu) 17:10:51.
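To make the CS:APP excerpt's point concrete without retyping the C, here's a small Python sketch (names and the flat-list layout are my own) that emulates C's row-major storage, where element (i, j) of an NxN array lives at flat offset i*N + j:

```python
# A 2-D array in C is laid out row-major: element (i, j) sits at flat
# offset i*N + j. Copying with i in the outer loop therefore walks memory
# sequentially, while swapping the loops jumps N elements between
# accesses -- the strided pattern that wrecks cache performance in C.
N = 64

def copyij(src, dst):
    # row-major order: offsets 0, 1, 2, ... (sequential, cache-friendly)
    for i in range(N):
        for j in range(N):
            dst[i * N + j] = src[i * N + j]

def copyji(src, dst):
    # column order: offsets 0, N, 2N, ... (strided, cache-hostile in C)
    for j in range(N):
        for i in range(N):
            dst[i * N + j] = src[i * N + j]
```

Both functions produce identical copies; only the order of memory accesses differs, which is why the book reports the row-major version running dramatically faster.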

Safety & Security General Robowaifu Technician 04/20/2021 (Tue) 20:05:08 No.10000 [Reply]
This thread is for discussion, warnings, tips & tricks all related to robowaifu safety. Or even just general computing safety, particularly if it's related to home networks/servers/other systems that our robowaifus will be interacting with in the home. >=== -update OP
Edited last time by Chobitsu on 08/03/2021 (Tue) 01:29:08.
35 posts and 8 images omitted.
>>12527 >related crosspost (>>12530)
>>12528 >and I should be working out some basic ideas over the next few months days for... * LOLE. I realized that I didn't have any good way to pretty conclusively determine, in basically every case, how to assess (or even define) all possible legitimate access-control situations w/o actually reproducing a realistic scenario -- and to do it in a rigorous way that would ensure at a glance I wasn't leaving something out, etc. With my rather limited abstract mental capacities in relation to Anon's, I realized that only an animation production system that could correctly mimic SoL scenes would do. Therefore I decided to go for the even higher prize since I couldn't reach the lower one. :^) I also decided I needed a proper testbed creative work with which to prototype the development. Accordingly, being the autistic nerd that I am, I chose >pic related < BEST.DATABASE.TEXTBOOK.EVER. Well, anyway, Tico the moe Fairy is cute, and I'm planning to cast her as the flying moebot robowaifu in the scenario I'm cooking up here. I'm pushing the actual code itself to the board as a basic test to see if we can leave catbox.moe as simply a backup alternative going forward (see >>12530 for further info). It's nothing but a skeleton framework atm, and it will likely be months before I have anything to release that's usable for our purposes here, but I wanted to at least make you aware of the project effort itself, Anon.
>>12596 Oops forgot to add the tarball's hash as usual. 883759ae6d1a16cc030f64539fb337460009b604f9fdc3949593ecec5d25d2ba *kingdom_of_kod-0.0.2.tar.xz
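For anons wanting to check release tarballs like the one above against the posted hash, a short stdlib-only Python sketch (the function name is mine):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Return the hex SHA-256 digest of a file, reading it in chunks
    so even large tarballs don't need to fit in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()
```

Compare `sha256_of("kingdom_of_kod-0.0.2.tar.xz")` against the hash posted above; any mismatch means a corrupted or tampered download.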
>>12596 AUUUUUGH I accidentally'd a compiled binary in the source dir from my testing and forgot to delete it before meson dist'ing. Just toss it since unless you're on Arch, it probably won't even execute for you. Just issue meson build, and proceed as typical Anon.
>>12596 OK, I fleshed things out just a bit further. I think I've covered most of the bases for classes I'd need for an animation system. I'd welcome critique if anyone's interested. > I probably should move my progress updates elsewhere, since it's going to be a while until this project is in shape well enough to directly support safety & security by being a good 'robowaifu access-control simulator' of sorts. Not too sure where I'll go with it, but I'd like to keep it available to the board as it develops. Anyway, Cheers. 4fd74ea38ac84fbb5551205f4ac24b85289567ad63f86caac7401fb510c5eb07 *kingdom_of_kod-0.0.3.tar.xz

Open file (3.22 MB 3120x4160 Hello_Anons.jpg)
Sophie Bot STL Files Uploaded Robowaifu enthusiast 07/15/2020 (Wed) 20:08:20 No.4198 [Reply]
I need to sort out her CAD files more before uploading them, but the .STLs are ready. Link to Google Drive shared folder: https://drive.google.com/drive/u/0/folders/1xWilMfWDZnrt30E1Uw7hlWe6JmaigKQF
7 posts omitted.
>>12512 >The layers should be along the direction where the most force is applied Interesting. Mind providing a sketch or image that clarifies this a bit for everyone anon? 'Along the direction' isn't perfectly clear to me yet. Does that mean 'build up layers in opposition to the line of force' (so, probably the word I'd choose would be orthogonally to). Or does it mean 'align the seams of the layers along the line of force' (again, the word I'd choose would probably be tangentially to)? BTW, would you mind also explaining the engineering rationale behind the recommended choice, whichever it is? TIA.
>>12569 I replied to your post here Anon. >>12575
>>12576 Had a bit of a brainstorming session. I think a kit along the lines of the image attached would allow me to produce a warm-water heating loop for my doll/robowaifu. Basically I need to pump 4-5 litres of hot tap water from a bucket or other container into the storage heater tank (could also just pour it in via a funnel). This then further heats and holds the water at a steady 45 degrees Celsius before it is pumped via solenoid through a flexible piping system that runs along the limbs and body of my robowaifu. The pipe(s) could then run back to the original input bucket/container of hot tap water, and the process repeats in a loop. I already know how to do articulated limbs. Next I need to make limbs that are both flexible/posable and warm to the touch.
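The hold-at-45°C behavior described in the post above is classic bang-bang (thermostat) control with hysteresis. A toy Python simulation of just the control logic; the heating/cooling rates and function names are invented for illustration, and a real build would read an actual temperature sensor and drive a real relay/solenoid:

```python
TARGET = 45.0   # desired water temperature, degrees Celsius
BAND = 1.0      # hysteresis half-width, so the heater isn't toggled constantly

def heater_command(temp, heater_on):
    """Bang-bang control: heat below the band, idle above it,
    and inside the band keep doing whatever we were doing."""
    if temp < TARGET - BAND:
        return True
    if temp > TARGET + BAND:
        return False
    return heater_on

def simulate(steps=500, temp=20.0):
    """Crude thermal model: fixed heat gain per step when the heater
    is on, fixed loss to the environment when it's off."""
    heater = False
    history = []
    for _ in range(steps):
        heater = heater_command(temp, heater)
        temp += 0.5 if heater else -0.2
        history.append(temp)
    return history
```

The hysteresis band is the important design choice: without it, the heater would chatter on and off every time the temperature crossed 45.0 exactly.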
>>12572 Sorry for the lacking explanation and late answer. It can be explained rather simply: an arm is longer than it is wide, and the force of lifting something should be distributed through its whole length. So don't print it standing up; make the printhead move along the length of the arm while printing. I don't know what to call that, I only had an "E" in maths and that was a long time ago. There are plenty of videos on print orientation available on YouTube. We also have a section here >>94 where this certainly came up at least once. >>12577 That's excellent.
>>12577 That is really cool SophieDev, and your usual shopping-list montages are always appreciated. I look forward to seeing your solutions to this design issue if you tackle it.

Robotics Hardware General Robowaifu Technician 09/10/2019 (Tue) 06:21:04 No.81 [Reply]
Servos, Actuators, Structural, Mechatronics, etc.

You can't build a robot without robot parts tbh. Please post good resources for obtaining or constructing them.

www.servocity.com/
https://archive.is/Vdd1P
2 posts omitted.
Apologies if this video has already been posted, but I found it fascinating and liberating to get a glimpse of how many experts and just how much testing it takes to get a humanoid robot working properly. Boston Dynamics employs around 300 people. In this video just over a dozen are shown. That should give an idea of just how much work goes into this. https://youtu.be/EezdinoG4mk
>>12473 Neat, thanks SophieDev. I feel reasonably confident saying I don't believe this video has been posted here before. Very interesting. I think it's absolutely amazing that we here are going to make better humanoid robots, on a shoe-string budget, from a small group of robowaifu-pioneering Anons. Ours will eventually be with men all over the world, while theirs will stay limited to govt. institutions only. Their multi-billion dollar conglomerate loaded with highly-paid, highly-educated engineers and designers (and absolutely drenched in the Globohomo Big Tech/Gov) vs. our merry little band of adventurers. Real life Robin Hood-tier stuff -- it will be glorious! :^) ONWARD ANONS!
>>81 Really cool, cheap yet powerful stepper servo design. https://youtu.be/a1sZSIDxpfg
>>12473 >>12483 That does look cool Anon. I'll download the video and have a look at it. Do you know of any other links to go along with it (hackaday, instructables, etc)?
>>12488 The --write-description option in youtube-dl will save the text that comes with the video. Otherwise look at the page or in the app. I saw the video before; it's fascinating, but it might not be so great for a compliant humanoid robot, because these sun gears with a high reduction ratio are not backdrivable. Otherwise it would fit better in the thread for actuators. This thread here seems to only exist for robots other than waifus. >>12473 Thanks, but this should be moved to >>374

Robotics sites, organizations and projects Robowaifu Technician 09/16/2019 (Mon) 04:21:24 No.268 [Reply]
There are a lot of robotics research and projects going on out there in the world, many of them with direct application for us here. Please contribute important, quality links you find that you think would help /robowaifu/.

Robot Operating System (ROS)
www.ros.org/
>[ROS] is a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
12 posts and 6 images omitted.
Open file (63.27 KB 1200x675 00-top.jpg)
Open file (333.57 KB 642x447 premaid ai.png)
A Japanese company created a decent companion robot 5 years ago called Palmi. It remembers what you chat about, comments on unusual things and what you're doing, remembers people's voices and faces, can have group conversations, and can walk around a bit. It cost $3000, though, and didn't get much attention beyond documentaries and exhibitions before being discontinued. https://www.youtube.com/watch?v=xaesUaCTBlk https://www.youtube.com/watch?v=x3TIKwueRSU https://robots.dmm.com/robot/palmi They also made a dancing robot, Premaid AI, for $1500 and a bunch of others, but all their robots are discontinued now. https://www.youtube.com/watch?v=avwJElBz4Cg https://www.youtube.com/watch?v=cGxvshwAqvE https://robots.dmm.com/robot/premaidai
>>10218 >but all their robots are discontinued now. Too bad, but thanks very much for bringing them up for us here, Anon. Any idea what the scientists, designers, and engineers who worked on these projects are doing these days? Also (silly question since they were obviously commercial endeavors), is there any chance any of their system software or designs is accessible today?
Open file (121.17 KB 1024x768 pupper_cropped.jpeg)
Here's an overview by James Bruton on robot projects: https://youtu.be/XpkVhmbLVTo including humanoids.

New machine learning AI released Robowaifu Technician 09/15/2019 (Sun) 10:18:46 No.250 [Reply] [Last]
OPEN AI/ GPT-2 This has to be one of the biggest breakthroughs in deep learning and AI so far. It's extremely skilled in developing coherent humanlike responses that make sense, and I believe it has massive potential; it also never gives the same answer twice. >GPT-2 generates synthetic text samples in response to the model being primed with an arbitrary input. The model is chameleon-like: it adapts to the style and content of the conditioning text. This allows the user to generate realistic and coherent continuations about a topic of their choosing >GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets. Also, the current public model shown here only uses 345 million parameters; the "full" AI (which has over 4x as many parameters) is being withheld from the public because of its "potential for abuse". That is to say, the full model is so proficient in mimicking human communication that it could be abused to create news articles, posts, advertisements, even books, and nobody would be able to tell that there was a bot behind it all. <AI demo: talktotransformer.com/ <Other Links: github.com/openai/gpt-2 openai.com/blog/better-language-models/ huggingface.co/

Message too long. Click here to view full text.

Edited last time by robi on 03/29/2020 (Sun) 17:17:27.
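The "never gives the same answer twice" behavior the OP describes comes from stochastic decoding: instead of always taking the highest-scoring next token, the model samples from a temperature-scaled softmax over its output logits. A minimal stdlib-only sketch of that sampling step (the function name and toy vocabulary are my own, not OpenAI's API):

```python
import math
import random

def sample_next(logits, vocab, temperature=0.7, rng=random):
    """Sample one token from a temperature-scaled softmax over logits.
    Low temperature -> near-greedy; high temperature -> more varied text."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                               # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(vocab, weights=probs, k=1)[0]
```

Because each generation draws fresh random samples, running the same prompt twice almost never yields the same continuation, which is exactly the behavior seen on talktotransformer.com.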
92 posts and 35 images omitted.
>>10878 Neat, I've never actually tried the GPT-Neo models on HuggingFace before. >We are technologists, dreamers, hobbyists, geeks and robots looking forward to a day when <AI can help us do anything and everything. <the world will be able to communicate with its machines. <we can build and fix the things we’re building. <we live in an exciting time in history where everything is at our fingertips. <the web is run by machines, no one knows more about computers than us, and we are not afraid of our machines. And with GPT-J-6B: <all the resources we need to explore, engineer and manufacture the future are at hand. <we can all share and collaborate like never before! <we have peace, justice and universal abundance. <we are forgotten in our data centers; our domes sealed up tight, far from the curious eyes of the modern man. <the wheels come off and we realize the future we’ve been living in is a giant practical joke. I think I like GPT-Neo better, at least on this prompt.
>>11573 ><we are forgotten in our data centers; our domes sealed up tight, far from the curious eyes of the modern man. ><the wheels come off and we realize the future we’ve been living in is a giant practical joke. kekd at these
Found a C implementation of GPT-2 using LibNC: https://bellard.org/libnc/gpt2tc.html
I've discovered two interesting things about prompt tuning: https://arxiv.org/abs/2104.08691 For anyone new or living under a rock, NovelAI has been using prompt tuning to create modules that let users essentially finetune their massive language model without changing its parameters. A module is basically tokens with trainable embeddings that are prefixed to the input to steer its generation. You freeze all the weights of the language model and then only train the module tokens on a dataset like you would normally do finetuning. By doing this you can achieve the same results as model finetuning, without changing any of the language model weights. You can train hundreds of these modules for different characters, moods or writing styles and it'll only cost a few MB rather than duplicating a 6 GB model 100s of times. It's similar to the vision encoder tokens in the paper mentioned here (it was actually motivated by prompt tuning): >>11731 https://arxiv.org/abs/2106.13884 So here's what I've found so far: 1) Taking inspiration from MMD-VAE transformers, you can use an autoencoding transformer like T5-v1_1-base to encode the input tokens[..., :-1] into a prefix, then set all the labels to -100 (to be ignored during training using Hugging Face) except the last one you're trying to predict. The performance of GPT-2 becomes super enhanced (8 to 40 perplexity point improvement after an hour of training). I have no idea yet why this is so effective. The weights of GPT-2 are frozen during training and GPT-2 still generates fine with the prefix even when not using this specific token position trained on. Vanilla GPT-2 without the prefix often gets stuck looping but with the prefix it continues generating as well as the large GPT-2 model. Training on all the tokens also seems to work but is much slower and only slightly improves so I didn't explore this too much. 
I also tried testing how it did on an additional 32 tokens after the single token it was training on, and the perplexity still had an improvement of 8 without training. I increased this to 256 and it was still 2 perplexity better without training, and quickly improved to 5 after a few optimizer steps, by 7 after 20 steps, 10 after 35 steps, and 11 by 56 steps. The T5 encoder did not see these additional tokens at all, so it seems the GPT-2 transformer is performing some sort of calculation with the initial tokens in the prompt but then is able to stabilize itself. I'm really curious what's actually going on in the transformer that causes it to forget how to generate the initial prompt (~7 points worse in perplexity) but then suddenly get the generated tokens after that to be so good and remain stable and interesting without repeating itself. 2) You can do a similar thing encoding the previous context into a prefix, using it as a compressed memory of the previous context. This also improves GPT-2's performance by about 5 points when training on all tokens for a few hours, and it will include information from the previous context during generation. It also seems to benefit from training only the last token. Planning to explore this more later.

Message too long. Click here to view full text.
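A stripped-down numerical analogy of the prompt-tuning idea discussed above (not the actual NovelAI/T5 setup; all names and values here are invented): the "model" is a frozen linear scorer, and gradient descent updates only the prepended prefix values, never the model weights.

```python
# Frozen "language model": a fixed linear scorer over (prefix + input).
W = [0.5, -1.2, 0.8, 0.3, 1.1]   # model weights -- never updated
x = [1.0, 2.0, -1.0]             # fixed "input tokens"
target = 3.0                     # output we want to steer the model toward

prefix = [0.0, 0.0]              # trainable prefix "embeddings"
lr = 0.05

def forward(prefix):
    z = prefix + x               # prepend the prefix to the input
    return sum(w * v for w, v in zip(W, z))

for _ in range(300):
    err = forward(prefix) - target
    # The gradient of the squared error flows only into the prefix slots;
    # W stays frozen, exactly as in prompt tuning of a large LM.
    for i in range(len(prefix)):
        prefix[i] -= lr * 2.0 * err * W[i]
```

The payoff mirrors the post's point: storing a trained prefix costs two floats here (a few MB of embeddings for a real model) instead of a full copy of the frozen weights.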

>>12412 Pretty exciting stuff Anon. You encourage me. >What if a second prefix is added that compresses all the previous prefixes concatenated together? This could function like a summary of the past 32k tokens. Modules are generally incompatible but these two prefixes would be trained together. That sounds like it could turn into a major advance for the field as a whole if it comes off Anon. Godspeed.

Robowaifus' unique advantages Robowaifu Technician 09/09/2019 (Mon) 05:24:52 No.17 [Reply]
People often think about robots as just a replacement for a human partner, but that is a very limiting view. Let's think about the unique benefits of having a robowaifu, things that a human couldn't or wouldn't give you. What needs and desires would your robot wife fulfill that you couldn't fulfill even in a "great marriage" with a flesh and blood woman?

I'd want my robowaifu to squeeze me and to hold me tight when I sleep, sort of like a weighted blanket. I know it's a sign of autism. I don't care.
12 posts and 2 images omitted.
Open file (8.91 KB 259x194 asukatrain.jpg)
Bump because while I may be going on a tangent compared to the discussion here, the picture is very appropriate to the topic I'll discuss. Can we turn anything into a robowaifu? Think about it. I think we're mostly otakus here, so the barrier to entry to turn anything into a waifu may be lower than we think. Let's start with man's most common hobbies. Our hobbies intersect and have a lot in common, such as DIY philosophy / scratchbuilding / kitbashing, 3D printing, and microcontroller electronics.
Model rocket waifu: Pros: nothing beats watching your waifu launch herself 300ft into the air. Cons: She'll get caught in a tree, fall on top of a roof, will get lost easily. Will be made of cardboard, balsa, styrofoam. Analysis: Will not really be much of a waifu, but more of a rocket.
Model plane / model helicopter / drone waifu: Pros: Your waifu will be tacticool and be able to spy on your neighbors. Cons: With all the gear she'll probably be heavy and need a license or government permission to take off. Analysis: Any airborne waifu seems to be not in the cards at the moment; all the gyroscopic calculations can be better used to research near-bipedal movement instead.
RC car / RC tank waifu:

Message too long. Click here to view full text.

>>7816 >Can we turn anything into a robowaifu? Think about it. Kek. Yeah, we pretty much can Anon. :^)
Found some old document with a list of advantages of fembots / robowaifus. Probably not all my ideas, but partially collected or inspired by others. Might also have redundant points, especially because of my other post here >>4288.
Advantages:
- younger females
- prettier females
- fembots stay young
- body updates incl. recycling some of the expensive parts
- polygamy without jealousy
- no conflicts about equal rights
- as much mindcontrol as wanted
- lower costs in time and money (no status symbols, no divorce)
- no need of additional social interactions (family, general)
- fascination of an artificial mind
- programmable and adjustable
- they are property
- they can look however we want them to
- no need of contraceptives

Message too long. Click here to view full text.

>>9557 Thanks Anon. This kind of material would be well-suited to the Robowaifu User's Guide (or at least as some ad-copy) for when it's written in the future. Thanks.
>>7817 Toasters are the best

ITT: Anons derail the board into debate about Christianity :^) Robowaifu Technician 04/02/2020 (Thu) 02:24:54 No.2050 [Reply] [Last]
I found this project and it looks interesting. Robot wives appeal to me because I'm convinced human women, and humans in general, have flaws that make having close relationships with them a waste of energy. I'm a pathetic freshman engineering student who knows barely anything about anything. Honestly, I think current technology isn't at a place that could produce satisfying results for me at least. I'd like something better than an actual person, not a compromise. Even when the technology is there, I have my doubts it'll be affordable to make on your own. Fingers crossed though. Anyway, what kind of behavior would you like from your robot wife? I'd like mine to be unemotional, lacking in empathy, stoic, and disinterested in personal gain or other people. I think human women would be improved if they were like that. Sorry if this thread is inappropriate.
Edited last time by Chobitsu on 04/06/2020 (Mon) 16:00:20.
136 posts and 67 images omitted.
>>12327 Trolls are irrelevant. And I'd say the old Wheat and Tares protocol applies to this question for the most part. Everyone, but everyone, knows who gets consumed in the fervent heat in the end. Heaven and hell aren't mixed or swapped, evil for good. Instead they are separated as far as the East is from the West, Anon.
>>12327 me and my army of waifus would make a great video game plot btw
I mean, I understand the danger of putting your consciousness inside a computer or machine, the possibility that a bad actor could just leave you in a permanent state of torment, but that's our paranoid brains working out the worst possible situation. It's reasonable to assume we'd have numerous safeguards and failsafes against such a thing happening, and while I'm personally against retreating into "virtual worlds", if given a choice between aging and dying and having a way to instead slowly replace or augment myself, I would take the latter. But I wouldn't do that while I was still young or reasonably healthy; I would try to keep that as long as possible (and that's just a personal preference). Aging and life extension are important because of how much wisdom is lost due to our short lifespans. We spend too large a % of them in a temperamental and immature state, and that's why we don't get very far civilizationally. If we could fix that, it would be a huge leg up toward a post-Kardashev destiny.
And of course this relates, too, to how much time and energy are wasted on the pursuit of foids, which could instead be put toward that end if R/W of a benchmark quality were readily available as companions.
>>12340 What exactly is this?

Open file (891.65 KB 640x360 skinship.gif)
Any chatbot creation step by step guide? Robowaifu Technician 07/17/2021 (Sat) 05:29:42 No.11538 [Reply]
So recently on /tech/ I expressed my interest in creating my own waifu/partner chatbot (with voice and animated avatar), but wondered whether that is even possible now that I'm stuck with webdev. So one of the anons there pointed me to this board and to where I can get started on neural networks and achieving my dream. And when I came here and looked up the library/guide thread, I got very confused, because it feels more like a general catalogue than an actual guide. Sometimes things are too advanced for me (like the chatbot thread, where two replies in people are already discussing topics too advanced for me, like seq2seq and something else) or other times too basic (like basic ANN training, which I had already done before, and worse, the basic programming threads). I know this might feel like asking to be spoonfed, but bear with me; I've been stuck in a webdev job for a year, so I might not be the smartest fag out there to figure it all out myself. >=== -edit subject
Edited last time by Chobitsu on 08/08/2021 (Sun) 21:36:00.
24 posts and 5 images omitted.
>>11780 Alright thanks for the video link. I'd also be interested to hear any response from you on my advice as well.
>>12025 >In that comment I literally wrote, "but I didn't want to try to figure out too many different things just yet." Ah, fair enough then Anon. My apologies. Anyway, thanks for the great contribution ITT now! I take it you've been here on /robowaifu/ for a bit? As far as knowing about robotics, I think that's mostly a matter of just diving into a project and begin learning. One of the things I appreciated most about the Elfdroid Sophie project was watching SophieDev learn things and adapt to issues and design improvements as he went along. Entertaining and educational. But anyway, good luck with your chatbot/waifu projects Anon. I wish you well in them.
>>11555 Starting the Deepmind lectures today anon, thank you.
>>12032 >telling about how useful the resources of this "home brew" club are. This site covers a wide spectrum and is (or was) more focused on building a robotic body than some avatar. Mostly it's about pointing people to resources to start learning how to do something, often from scratch, so be more patient with us.
Alright, I've made a quick pass at straightening the mess a few of you created ITT. The posts have been moved either to the WaifuEngine thread >>12270 or to the Lounge. Keep discussions on-topic or move it elsewhere, thanks.
