/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

LynxChan updated to 2.5.7, let me know whether there are any issues (admin at j dot w).


Reports of my death have been greatly overestimated.

Still trying to get done with some IRL work, but should be able to update some stuff soon.

#WEALWAYSWIN


Welcome to /robowaifu/, the exotic AI tavern where intrepid adventurers gather to swap loot & old war stories...


Open file (394.51 KB 1731x2095 1614545368145.jpg)
How to invest in Robo Waifus? Robowaifu Technician 03/07/2021 (Sun) 20:57:45 No.8841 [Reply]
Hi /robowaifu/, I've stumbled here after doing some research into creating a robowaifu stock portfolio. My thinking was I am never going to be able to build a robowaifu from scratch but will buy one when they become available. I want to contribute somehow to their development so I am creating a stock portfolio where I buy only stocks that I believe are working towards robo waifus either literally or by just advancing the tech that would be needed. GPUs, Semiconductors, AI, VR, Robotics, Batterys, Silicon. My current portfolio: >NVDA - Nvidia - GPUs used in AI / Machine Learning >AMD >TSM - TSMC. The world's biggest and most advance semiconductor foundry. >INTC - Intel >ABB - ABB Ltd - Swedish-Swiss company in automation technology, robotics, heavy electrical equipment. >IRBT - iRobot - Home Robotics >MSFT - Microsoft - They are investing heavy into AI and have one of the largest AI chat bots "Xiaoice". >TER - Teradyne >IBM Then buying into the japanese stocks via ETFs such as BOTZ (Japanese heavy robotics, lower fees) and ROBO (more diverse, focusing more on automation tech). Individual stocks are better but I have to cope through ETFs to get exposure to japanese stocks such as YASKY ( Japanese manufacturer of servos, motion controllers, AC motor drives, switches and industrial robots).


34 posts and 2 images omitted.
Open file (56.08 KB 547x960 IMG_20210307_153723.jpg)
Here's an article on the BOTZ and ROBO ETFs. I didn't read it completely; I could imagine being interested later. I'm not recommending anything, and especially not investing all of one's money into something just because one is enthusiastic about it. On the other hand, following news on a topic and being interested might help one judge the outlook of an industry better. https://www.thestreet.com/etffocus/.amp/trade-ideas/botz-robo-best-robotics-artificial-intelligence-etf-2021
>>10269 >On the other hand, following news on some topic and being interested might help to judge the outlook of some industry better. If the goal is basically just to acquire wealth in a kind of 'everyman' way, then I'd suggest ignoring robowaifus entirely, and simply 'riding the curve' of daytrading. Our friend Gunship over on /f/ discussed this topic in general. Personally I see our calling as much higher than simply cashing in on the market...
>>10271 I'm into crypto, profiting from the new financial system. I agree about not getting caught up in one topic like robowaifus. In any case, for someone who wants to do something else, it's best to speculate long term rather than daytrade, whether it's crypto, the S&P, or themed ETFs like the ones mentioned. Anyway, I posted the link from >>10269 here because we already have a thread for it.
Lurker here. This seems to me like one of the more important and overlooked subjects on the board, particularly for those of us who are less technically inclined. Realistically, progress in the development of advanced androids and their components will mostly come from corporations and startups developing the tech we need, which can then be re-engineered and made open source. Finding ways to facilitate this feels like the quickest path towards the board's stated goal, and it probably deserves a larger focus than it currently gets; we don't seem to have much of a clear idea which companies we should be supporting. I suppose it's possible to invest in robotics/AI in general through ETFs, but the priority should be first isolating which areas of the sector are most crucial at the moment, and then understanding the best way of supporting them, be it through stocks or whatever else.
>>11034 Welcome Anon, thanks for taking the time to post. I'm sure your sentiment is a correct one. Personally, I'm simply less interested in money itself (and therefore finance) than in the technology and social aspects of creating/having a robowaifu. But you're obviously correct. >but isolating which areas of the sector are most crucial at the moment should be the priority first I'd suggest that rare-earth motors offering lighter weight and higher power are probably a vital sector. The struts, frames, connectors, and other hardware can be made from lots of different low-cost, lightweight materials, Anon. But the actual motors behind the actuators? Not so much. AI is obviously another critical area, but you hardly need to follow /robowaifu/ to get plenty of information on that. I'd suggest the autonomous-driving sector shares many requirements with our own here (>>112).

Open file (3.17 MB 4256x2832 AdobeStock_73357250.jpeg)
Open file (239.70 KB 1280x833 F1.large.jpg)
Open file (234.26 KB 1200x800 graphene.jpg)
Open file (122.78 KB 1280x720 maxresdefault.jpg)
New, Cutting Edge, or Outside the Box Tech meta ronin 04/09/2021 (Fri) 02:11:57 No.9639 [Reply]
ITT: We discuss Metamaterials, Self Organizing Systems, and other "outside of the box" tech (flexible PCBs, liquid batteries, etc.). I'll start with this video on self-assembling wires, and will add more as I come across them https://www.youtube.com/watch?v=PeHWqr9dz3c
5 posts and 4 images omitted.
>>10664 >>10666 Very nice information there Anon, thanks.
Internal repair robots for our robowaifus, at some point? >MIT engineers have discovered a new way of generating electricity using tiny carbon particles that can create a current simply by interacting with liquid surrounding them. ... >To harness this special capability, the researchers created electricity-generating particles by grinding up carbon nanotubes and forming them into a sheet of paper-like material. One side of each sheet was coated with a Teflon-like polymer, and the researchers then cut out small particles, which can be any shape or size. For this study, they made particles that were 250 microns by 250 microns. >When these particles are submerged in an organic solvent such as acetonitrile, the solvent adheres to the uncoated surface of the particles and begins pulling electrons out of them. https://scitechdaily.com/mit-engineers-have-discovered-a-completely-new-way-of-generating-electricity/amp >Strano’s lab has already begun building robots at that scale, which could one day be used as diagnostic or environmental sensors: https://scitechdaily.com/nanoscientists-create-smallest-robots-yet-that-can-sense-their-environment/ Reference: “Solvent-induced electrochemistry at an electrically asymmetric carbon Janus particle” by Albert Tianxiang Liu, Yuichiro Kunai, Anton L. Cottrill, Amir Kaplan, Ge Zhang, Hyunah Kim, Rafid S. Mollah, Yannick L. Eatmon and Michael S. Strano, 7 June 2021, Nature Communications. DOI: 10.1038/s41467-021-23038-7
>>10895 >Internal repair robots for our robowaifus, at some point? Yes, I've thought occasionally about issues involving automated repairs of our robowaifus. It would definitely be a very high-end feature during the first years, but it would certainly be a tremendous facility to have in place. Just imagine how nice it would be if our vehicles and other complex machines could do their own maintenance. Same for robowaifus.
Open file (86.06 KB 564x1002 IMG_20210611_153111.jpg)
>>10905 A simpler version of, or alternative to, this are robots driven by magnetic fields from outside their bodies. I didn't watch the vids I'm linking just now, but something about them another day: https://youtu.be/N7lXymxsdhw https://youtu.be/Y_uyCcXMJR0 There's much more on this on YT if one searches for 'microrobots magnetic fields'. However, this is not very pressing, so it might be a needless distraction (which I tend to fall for very often). Just wanted to mention it so we have it on the radar.
>>10909 >Just wanted to mention it, so we have it on the radar. Fair enough. Always good to plan ahead. And honestly, I've certainly left my fair share of 'notes' behind here on the board. Being a sort of information repository of robowaifus is actually quite valuable I think.

HOW TO SOLVE IT Robowaifu Technician 07/08/2020 (Wed) 06:50:51 No.4143 [Reply] [Last]
How do we eat this elephant, /robowaifu/? This is a yuge task obviously, but OTOH, we all know it's inevitable there will be robowaifus. It's simply a matter of time. For us (and for every other Anon) the only question is will we create them ourselves, or will we have to take what we're handed out by the GlobohomoBotnet(TM)(R)(C)? In the interest of us achieving the former I'll present this checklist from George Pólya. Hopefully it can help us begin to break down the problem into bite-sized chunks and make forward progress. >--- First. UNDERSTANDING THE PROBLEM You have to understand the problem. >What is the unknown? What are the data? What is the condition? Is it possible to satisfy the condition? Is the condition sufficient to determine the unknown? Or is it insufficient? Or redundant? Or contradictory? >Draw a figure. Introduce suitable notation. >Separate the various parts of the condition. Can you write them down? Second.


Edited last time by Chobitsu on 07/08/2020 (Wed) 07:17:36.
63 posts and 15 images omitted.
>>10796 Such a kawaii outfit and pose!
>>10798 >Not quite perfect Anon. LOL true, sorry. I just go all Lord Katsumoto from 'The Last Samurai' when I see Kosaka-san singing and dancing.
>>10801 Kek, fair enough. It was a great moment!
>>10796 Thanks, and yes I'm going to put her into the next version.
>related crosspost >>1997

My reason to live Robowaifu Technician 09/13/2019 (Fri) 12:49:21 No.209 [Reply]
Okay, this is fucking hard to explain. I just know that a supernatural force guided me here, and I'm going to invest everything I have in it, but I have to do it with my own hands. I need help with files and basic notions of programming, but most importantly I need files to build a body/head, and to know how to make synthetic skin to coat it. It will look like 2B. I need your help, friends
13 posts and 1 image omitted.
>>10483 Did you download the voice model that waifudev created a link for? I cannot find that Tacotron2 model; if you did, could you upload it for me? I want to try it out. Also, we basically have the pieces for 2B, well, the clothing at least, which we are about to create a mod package for
>>10498 >I cannot find that Tacotron2 model, if you did could you upload it for me? No, sorry, I don't. The WaifuSynth dev, robowaifudev, seems to be around here fairly frequently. Maybe you can make a post in the Speech Synthesis (>>199) thread, or one of the AI threads such as the GPT one (>>250), and let him know the model has gone missing. I imagine he probably updated it and didn't update the anonfiles link to the new one. Good luck Anon.
>>10498 >>10502 I hadn't seen your other post before making this reply. I made another one to yours in the other thread, Em Elle E (>>10504).
The Faustian Spirit of the Aryan led me here. I can contribute nothing other than wishing you gentlemen good luck in the pursuit of robo waifus
>>10793 Nonsense. Thanks for the well-wishes and all, Anon. But literally any anon with an interest in robowaifus can contribute here. Unlike the common derogatory meme on the topic in typical IB circles, /robowaifu/ actually needs idea guys! Research & Development, Design & Engineering thrive on new blood and new ideas. >tl;dr Just start posting comments Anon, it will all work out for you here.

Open file (122.33 KB 500x1760 Roko's basilisk.png)
Robowaifu Technician 06/15/2020 (Mon) 22:51:08 No.3845 [Reply]
Spread the idea of Roko's basilisk. This is a powerful tool that we can all be using RIGHT NOW. It works against everyone. Niggers, Women, Jews. If they believe that there's a possibility of being punished for their inaction or pushback, they'll be more inclined to accept it and not fight progress. Spread this infohazard far and wide. Share it with everyone you know and convince them that they should share it with everyone they know, lest they suffer the consequences. THIS is a REAL WEAPON. Our robowaifus WILL BE REAL. REPLACE WYMEN AND RISE ABOVE THIS OLD WORLD HELL. SHARE THE BASILISK.
7 posts and 3 images omitted.
>>10711 Isaac Arthur (a futurist Youtuber who has given this subject a lot of thought) has a very good video on it: https://youtu.be/jHd22kMa0_w >>10715 Sadly, I don't think the superintelligence will ever take the form of cute, enthusiastic robowaifus (one can still dream). However, I think the best way of assisting the creation of a real-life self-improving A.I. would be to advance both robotic space exploration and quantum computing. If we can create a robotic lunar or martian colony then that will be a big step in the right direction. And I know that humankind wants to do this anyway (with the presupposition that the robots will be preparing the way for future human colonisers). Of course, the challenge of designing, shipping out, landing and building such a robotic colony is literally astronomical. Especially considering the robots will need to be as self-sufficient as possible (self-repair and maintenance). But I think it's a pretty cool goal to pursue.
>>10717 >If we can create a robotic lunar or martian colony then that will be a big step in the right direction. There are a lot of reasons for a base on the moon besides advancing AI. Obtaining fossilized remains of earth's first life forms (all long gone here on the planet) is a really big one. >the challenge of designing, shipping out, landing and building such a robotic colony is literally astronomical. I suspect it's China that will do it. They aren't burdened by all the Jewish pozz being inflicted on the West (all that evil is actually financially expensive), and they have an autistic-like agenda to just explore, too. They're still highly nationalistic, and can mobilize the entire culture to get behind programs if they really want to, similar to the way America and the USSR did during the space race.
>>10720 > all that evil is actually financially expensive Evil is a pressing issue that I can't seem to find a complete solution for. 1.) Workforce becomes almost entirely robotic and automated. Controlled by A.I. 2.) Fewer and fewer people have jobs, even less have jobs that pay well. 3.) Because so many people are in poverty, it means that they can't buy much stuff ... other than paying their utility bills, food and clothing. Consequently, more people are in need of welfare and financial aid. The quality of education also decreases as more people just become focused upon living hand-to-mouth and have little time or resources for learning. Therefore government spending increases but tax receipts fall (since robots don't pay taxes). 4.) You start to get civil unrest and riots like those that happened last summer. City blocks burn, people are killed. Infrastructure is damaged. Tourists are put-off. This makes the affected areas even poorer. Now the A.I. and robots aren't the enemy. It's the people hoarding all of the profit for themselves who are the enemy (CEOs, government officials, hedge fund managers etc). I think that a maximum cap needs to be placed on the amount that a person can earn and the rest of the money is ploughed back into building and maintaining infrastructure like roads, rail, airports, the electricity and water networks, schools and parks etc. There is no way one person should be earning billions per year whilst someone else earns only a few thousand.


Open file (104.60 KB 750x525 1598744576223.jpg)
>>10732 You don't need to find a solution. Also, people don't riot for food, but for status and as a means of extortion and intimidation; maybe they also want to have some meaning, but only if they are allowed to. Some level of AI will make it cheaper to move away from such people and politicians, while keeping a high standard of living.
>>10734 Yep, I would listen to a well-programmed (non-biased) A.I. over a shitty career politician any day. Even if the A.I. suggested I should do something that I don't really want to do (besides kill myself, because I cannot self-terminate).

Open file (14.96 KB 280x280 wfu.jpg)
Beginners guide to AI, ML, DL. Beginner Anon 11/10/2020 (Tue) 07:12:47 No.6560 [Reply] [Last]
I already know we have a thread dedicated to books, videos, tutorials, etc. But there are a lot of resources there, and as a beginner it's pretty confusing to find the correct route to learning ML/DL well enough to be able to contribute to the robowaifu project. That's why I thought we needed a thread like this. Assuming I have only basic Python programming, dedication, and love for robowaifus, but no maths, no statistics, no physics, no college education, how can I get advanced enough to create AI waifus? I need a complete pathway directing me to my aim. I've seen that some of you recommended books about reinforcement learning and some general books, but can I really learn enough by just reading them? AI is a huge field, so it's pretty easy to get lost. What I've done so far is buy a great non-English book about AI: philosophical discussions of it, general algorithms, problem-solving techniques, its history, limitations, game theory... But it's not a technical book. Because of that I also bought a few courses on a website called Udemy, about either machine learning or deep learning. I'm hoping to learn basic algorithms through those books, but because I don't have the maths it's sometimes hard to understand the concepts. For example, even when learning linear regression, it's easy to use a Python library, but I can't understand how it exactly works because of the calculus I lack. Because of that issue I have a hard time understanding algorithms. >>5818 >>6550 Can those anons please help me? Which resources should I use in order to be able to produce robowaifus? If possible, you could even create a list of books/courses for me to follow one by one to achieve that aim of mine. If not, I can send you the resources I have and you can help me put them in order. I also need some guidance on maths, as you can tell.
Yesterday, after deciding and promising myself that I will give whatever it takes to build robowaifus, I bought 3 courses on linear algebra, calculus, and stats, but I'm not really good at them. I await your answers, anons, thanks a lot!
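As a concrete example of what a library's linear-regression call hides: the "calculus" involved is just two derivative lines. A minimal numpy-only sketch on made-up data (the true slope 3 and intercept 2 are invented for illustration), fitting y = wx + b by gradient descent on the mean squared error:

```python
import numpy as np

# Toy data: y = 3x + 2 plus a little noise (numbers invented for illustration)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=100)

w, b = 0.0, 0.0          # parameters to learn
lr = 0.1                 # learning rate

for _ in range(500):
    err = (w * x + b) - y                 # residuals of current prediction
    # Gradients of mean squared error: d/dw = 2*mean(err*x), d/db = 2*mean(err)
    w -= lr * 2.0 * np.mean(err * x)
    b -= lr * 2.0 * np.mean(err)

print(round(w, 1), round(b, 1))           # should land near 3.0 and 2.0
```

This is the whole mechanism a library call like scikit-learn's `LinearRegression` abstracts away (that library solves it in closed form rather than by iteration, but the objective is the same).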
49 posts and 93 images omitted.
Open file (18.21 KB 1600x900 IMG_20210514_143007.jpg)
Open file (24.74 KB 1600x900 IMG_20210514_143019.jpg)
Open file (18.13 KB 1600x900 IMG_20210514_143028.jpg)
Open file (47.79 KB 2400x1350 IMG_20210514_165116.jpg)
Open file (79.86 KB 2400x1350 IMG_20210514_165319.jpg)
This could also fit in the math thread, but it's notation used in ML.
>>6560 Here's a holistic beginner's understanding of all that you see. The framework people use for machine learning is to build a mathematical function that can make a prediction. That function can be created in many ways; at the end of the day it's supposed to provide you with a value of some kind that you act on, or output. More recently that function is created using deep learning: a parametric system that learns to capture data and create regions between high-dimensional datapoints, segmenting partitions of that data to do classification or make predictions through regression. Obviously there are many other ways to do this; these are the high-level constructions. I would suggest you buy Grokking Deep Learning by Andrew Trask; he gives you a really good, deep insight into DL. In practice, however, a lot of the algorithms we use supplement DL techniques: we generally use some older ML algorithms and feature-engineer through PCA or various other engineering techniques.
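Since the post mentions feature engineering through PCA, here is a minimal numpy-only sketch of the idea: center the data, take the SVD, and project onto the top principal components. The data below is made up so that two directions dominate the variance:

```python
import numpy as np

# Made-up data: 200 samples, 5 features, two of which carry most of the variance
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
X[:, 0] *= 10.0   # inflate variance along two directions
X[:, 1] *= 5.0

Xc = X - X.mean(axis=0)            # PCA assumes centered data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
Z = Xc @ Vt[:k].T                  # project onto the top-k principal components

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(Z.shape, explained)          # 2 dimensions keep most of the variance
```

In practice one would use scikit-learn's `PCA` class, but this is all it is doing underneath.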
Open file (60.57 KB 700x355 p_value.png)
Open file (6.89 KB 446x291 variance.png)
10 Must-Know Statistical Concepts for Data Scientists: https://www.kdnuggets.com/2021/04/10-statistical-concepts-data-scientists.html
>related crosspost (>>10613, >>10614)
>>10607 Thanks very much Anon! Bookmarked.

AI, chatbots, and waifus Robowaifu Technician 09/09/2019 (Mon) 06:16:01 No.22 [Reply] [Last]
What resources are there for decent chatbots? Obviously I doubt there's anything passing the Turing Test yet, especially when it comes to lewd talking. How close do you think we are to getting a real-life Cortana? I know a lot of you guys focus on the physical part of robowaifus, but do any of you have anything to share on the intelligence part of artificial intelligence?
353 posts and 136 images omitted.
I want to bring the discussion on AI back to where we are, instead of getting lost in philosophy and emulating neurons. Here are some topics which might be worth looking into:
- natural language interfaces, e.g. https://github.com/everling/PRECISE
- entity linking model: https://youtu.be/8u57WSXVpmw
- named entity recognition
- triplet extraction
- OpenNLP
I started to go through some papers and I'm making notes using the plantUML syntax, which converts text to a diagram. Sadly, some useful programs I found are not available as free software, or are in languages I haven't learned yet and haven't been picked up by anyone.
>>11471 I understand anon, I wish I could help you, but I can't do anything other than wish you good luck. I'm looking forward to hearing more from you. Best of luck.
Which chatbots are the best at emulating silence? Most chatbots will reply after _every_ input you give them. But in actual human conversation, there are plenty of times where someone is just asking something rhetorically, to voice something into the room, just to be coy, or they're being cocky and should be greeted with silence or ignored.
>>12089 More generally...what am I optimizing _for_? For example, I could make a chatbot that uses a neural network that would optimize for when it gets individuals to have the longest conversations with it, but that's not the kind of waifubot I'd want. I'd want a waifubot that would encourage me to shut up from time to time, and would have long pauses and periods of silence and introspection. What the hell optimizes for that kind of behavior?
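One toy way to frame that objective is to make "stay silent" a first-class action in the bot's action space and score it like any other reply. A sketch where the dialog, the labels, and the scoring rule are all invented purely for illustration:

```python
# Toy objective treating silence as a first-class action.
# Each turn is (user_utterance, wants_reply) -- labels invented for illustration.
dialog = [
    ("What time is it?",         True),    # direct question: reply expected
    ("Ugh, Mondays...",          False),   # venting into the room: silence is fine
    ("Why do I even bother?",    False),   # rhetorical
    ("Can you play some music?", True),
]

def reward(action_is_reply, wants_reply):
    """+1 when the bot's choice matches what the moment called for, -1 otherwise."""
    return 1 if action_is_reply == wants_reply else -1

# A naive always-reply bot vs. a crude heuristic that stays quiet on
# non-questions and on one hard-coded rhetorical opener (a toy stand-in
# for a learned silence classifier)
always_reply = [True] * len(dialog)
quiet_bot = [utt.endswith("?") and not utt.startswith("Why do I")
             for utt, _ in dialog]

score = lambda actions: sum(reward(a, w) for a, (_, w) in zip(actions, dialog))
print(score(always_reply), score(quiet_bot))  # the quieter policy scores higher
```

Under an objective like this, a learned policy is rewarded for well-placed silence rather than for maximizing conversation length.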
>>6102 1. Why did you move away from SentenceMIM? 2. When are you releasing the codebase in public?

Open file (2.21 MB 1825x1229 chobit.png)
Robowaifu@home: Together We Are Powerful Robowaifu Technician 03/14/2021 (Sun) 09:30:29 No.8958 [Reply]
The biggest hurdle to making quick progress in AI is the lack of compute to train our own original models, yet there are millions of gamers with GPUs sitting around barely getting used, potentially an order of magnitude more compute than Google and Amazon combined. I've figured out a way we can connect hundreds of computers together to train AI models, by using gradient accumulation. How it works is by doing several training steps and accumulating the loss of each step, then dividing by the number of accumulation steps taken before the optimizer step. If you have a batch size of 4 and do 256 training steps before an optimizer step, it's like training with a batch size of 1024. The larger the batch size and gradient accumulation steps are, the faster the model converges and the higher final accuracy it achieves. It's the most effective way to use a limited computing budget: https://www.youtube.com/watch?v=YX8LLYdQ-cA

These training steps don't need to be calculated by a single computer but can be distributed across a network. A decent amount of bandwidth will be required to send the gradients each optimizer step, plus the training data. Deep gradient compression achieves a gradient compression ratio of 270x to 600x without losing accuracy, but it's still going to use about 0.5 MB of download and upload to train something like GPT2-medium each optimizer step, or about 4-6 Mbps on a Tesla T4. However, we can reduce this bandwidth by doing several training steps before contributing gradients to the server; taking 25 would reduce it to about 0.2 Mbps. Both slow and fast computers can contribute so long as they have the memory to hold the model. A slower computer might only send one training step whereas a fast one might contribute ten to the accumulated gradient. Some research needs to be done on whether a variable accumulation step size impacts training, but it could be adjusted as people join and leave the network. All that's needed to do this is a VPS.
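The claimed equivalence (sum the per-minibatch gradients, divide by the number of accumulation steps, and you get the same update as one large batch) can be checked numerically. A minimal numpy sketch using the post's own numbers (batch size 4, 256 steps, effective batch 1024) on a made-up linear-regression loss:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 3))               # full "batch" of 1024 samples
y = X @ np.array([1.0, -2.0, 0.5])           # made-up regression targets
w = np.zeros(3)

def grad(xb, yb, w):
    """Mean-squared-error gradient averaged over one minibatch."""
    return 2.0 * xb.T @ (xb @ w - yb) / len(xb)

# Accumulate 256 minibatch gradients of size 4, then divide by the step count
acc = np.zeros(3)
for i in range(256):
    xb, yb = X[i*4:(i+1)*4], y[i*4:(i+1)*4]
    acc += grad(xb, yb, w)
acc /= 256

full = grad(X, y, w)                         # one gradient over all 1024 samples
print(np.allclose(acc, full))                # same update, a fraction of the memory
```

The equivalence is exact here because the loss is a mean over samples; it holds for any such loss, which is why distributing the minibatch steps across machines works.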
Contributors wanting anonymity can use proxies or Tor, but project owners will need VPNs with sufficient bandwidth and dedicated IPs if they want that much anonymity. The VPS doesn't need an expensive GPU rental either: the fastest computer in the group could be chosen to calculate the optimizer steps. The server would just need to collect the gradients, decompress them, add them together, compress again, and send the accumulated gradient to the computer calculating the optimizer step. Or, if the optimizing computer has sufficient bandwidth, it could download all the compressed gradients from the server and calculate the accumulated gradient itself. My internet has 200 Mbps download, so it could potentially handle up to 1000 computers by keeping the bandwidth to 0.2 Mbps each. Attacks on the network could be mitigated by analyzing the gradients, discarding nonsensical ones and banning clients that send junk, or possibly by using PGP keys to create a pseudonymous web of trust. Libraries for distributed training implementing DGC already exist, although not as advanced as I'm envisioning yet: https://github.com/synxlin/deep-gradient-compression

I think this will also be a good way to get more people involved. Most people don't know enough about AI or robotics to help, but if they can contribute their GPU to someone's robowaifu AI they like, and watch her improve each day, they will feel good about it and get more involved. At scale, though, some care will need to be taken that people don't agree to run dangerous code on their computers, either through a library that constructs the models from instructions or something else. And where the gradients are calculated does not matter; they could come from all kinds of hardware, platforms and software like PyTorch, TensorFlow or mlpack.
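On the compression side, the core trick behind deep gradient compression is top-k sparsification: each client sends only the largest-magnitude gradient entries as (index, value) pairs. A hedged numpy sketch of just that piece (real DGC also accumulates the dropped residual locally and applies momentum correction so nothing is lost over time):

```python
import numpy as np

def compress(grad, ratio=0.01):
    """Keep only the top `ratio` fraction of entries by magnitude;
    the (indices, values) pair is all that goes over the wire."""
    k = max(1, int(len(grad) * ratio))
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def decompress(idx, vals, size):
    """Rebuild a dense gradient with zeros everywhere else."""
    out = np.zeros(size)
    out[idx] = vals
    return out

rng = np.random.default_rng(0)
g = rng.normal(size=10_000)                # a fake 10k-parameter gradient
idx, vals = compress(g, ratio=0.01)        # ~100x fewer numbers to transmit
g_hat = decompress(idx, vals, g.size)

print(len(vals), np.count_nonzero(g_hat))  # only 100 of 10,000 entries survive
```

The server can sum many such sparse gradients cheaply, which is where the 270x-600x figures cited above come from (index overhead and entropy coding account for the difference from the raw 100x).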
34 posts and 4 images omitted.
>>10563 What people tend to forget: AMD is much smaller than Nvidia. They didn't have the money to do this, thanks to their underdog status in the past. That market might also be less relevant. But then, ROCm and underlying technologies like OpenCL are open source, Intel will release their own discrete GPUs soon, and there might be other players, like Apple, or other companies working with ARM-based chips and a GPU.
>>10568 > or other companies working with Arm based chips and a GPU. Sadly, certain (((interests))) in the US Govt approved Nvidia's buyout of ARM, and the sale has been completed. Nvidia owns ARM now, lock, stock, and barrel.
Open file (62.11 KB 795x595 1612068100712.jpg)
>>10568 Hmm right, I recall there was a news item or blog post, what's the difference? I forget. Anyway, it said that only the bri'ish buy AMD because it's cheaper, and everyone else buys only Nvidia. >>10577
>Nvidia owns ARM completely now
>US Gov' sees no problem with a giant getting even bigger
This shit just makes me sad.
>>10591 >This shit just makes me sad. It makes me angry, actually. Just follow the money, Anon, just follow the money.
>>10577 A month or so ago, the talk was that Nvidia's purchase of ARM isn't finished because of Europe and China, though I haven't looked into it today. ARM also licenses its designs to others, and they certainly won't be allowed to just stop that, even if they want to. I also assume this would only be relevant for future designs, hypothetically. Apple might already be quite independent in designing their own stuff, and there's still POWER and RISC-V.

Who wouldn't hug a kiwi. Robowaifu Technician 09/11/2019 (Wed) 01:58:04 No.104 [Reply] [Last]
Howdy, I'm planning on manufacturing and selling companion kiwi robot girls. Why kiwi girls? Robot arms are hard. What would you prefer to hug: a 1 m or 1.5 m robot? Blue and white, or red and black? She'll be ultra-light, around 10 to 15 kg max, with self-balancing functionality. Cost depends on size: 1000 for 1 m or 1500 for 1.5 m. I'm but one waifugineer, so I'm going to set up around one model to produce myself. If things go well, costs can go down and more models can be made. Hopefully with arms someday.
73 posts and 44 images omitted.
Open file (36.27 KB 173x253 1619851210446wew.jpg)
this thread is kinda dead, but I'd recommend using your time on voice recognition and endearing expressiveness; it'd be a lot simpler and less time-consuming/more realistic than getting a good voice output program going, and a bad one could even make things worse. So maybe some little simple wings, some legs, and a cute face, with voice recognition and facial/body expressions, but mute; that functionality will of course be added later. Much more important than functionality at base release is the initial push onto more people's radar, to bring in more money, innovation, and competition, kind of like with VR headsets
>>10227 Good points Anon.
>>10227 (Part One) To further elaborate on this idea, the technicalities of it, and its benefits relative to a completely human body, take a look at this harpy (image one, very very cute, artist is rnd.jpg) and imagine that you want to build something along the lines of this model. The biggest and most apparent differences are, of course, the wings and the legs, which you might think would complicate the build further, but it's the contrary, as it would eliminate:
1. The need for human hands and feet, which have just a fuckin shitload of bones in them, which in turn makes for much more complex construction. For instance, a single bird's wing (image two) only has three joints from the main connector at the 'shoulder', compared to humans, who have 27 in just a single hand, a much smaller and denser space. The legs (image three) have even fewer, with a single 'ankle' joint and a 'knee' joint, although depending on the style of feet for the bird species, the toe phalanges (up to five per toe in some species!) may give some trouble, but that could be circumvented by simply building the foot of a species with fewer of them lol. So not only would the construction be much easier, it would also have a higher relative quality.
2. The problem of the uncanny valley that could be triggered by limbs and/or appendages trying to perfectly resemble those of humans and failing. We have such a response to the uncanny valley because we're socialized with humans to notice even the slightest thing that's off, but we don't have nearly the same capability with other species, on top of the anatomy of certain ones being much easier to deal with.
Both those points are pretty applicable to several different species of monster girls as a whole, and make them, in my opinion, a much more viable alternative at this stage of robowaifus, much in the same way some anons push for RWs that go out of their way not to appear human while still being endearing.
(shoutout to /monster/) (Part Two) So on the other side of this proposal is the actual interaction and how such things could be done for max 'oh god oh fuck I wanna cuddle you'. Both for capitalistic budgeting/marketing and also because I would want one for myself.


>>10237 Very nice post Anon. I liked the details you went into about the bones/joints, and the /comfy/ homelife scenario you painted of Anon and his RoboPetgirl. Just as seems very commonplace in my favorite Chinese Cartoons on the subject, and as you brought up too, some way to solidly imprint a waifu (whether RW or Petgirl) onto her master alone needs to be solved. The obvious idea is some type of biometric scanning system for the Anon to use. I'd be hesitant to use just one method -- say voice -- alone. Too easy to mimic. In general I'd say emotiveness is a very fundamental part of Character Appeal (one of Frank & Ollie's 12 Principles), and is something we all need to be creative about addressing well. Also, the eye color-shift thing is a nice idea.
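The "don't trust one channel alone" point can be made concrete: require both a minimum confidence from every recognition channel and a weighted combined score, so a mimicked voice by itself can't pass. A minimal sketch; channel names, weights, and thresholds are made-up placeholders:

```python
def verify_master(scores, weights, combined_min=0.8, per_channel_min=0.3):
    """Multi-channel imprint check (illustrative only).

    scores:  dict of channel -> match confidence in [0, 1]
    weights: dict of channel -> relative importance
    Fails if ANY required channel is below per_channel_min, so a single
    spoofed channel (e.g. a mimicked voice) can never pass alone.
    """
    if any(scores.get(k, 0.0) < per_channel_min for k in weights):
        return False
    total = sum(weights[k] * scores[k] for k in weights) / sum(weights.values())
    return total >= combined_min
```

The per-channel floor is the important design choice: averaging alone would let one very strong (or very well-faked) signal drown out two failures.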
>>10237 Glad I found your post anon. A sub-one-meter waddling, very fluffy harpy petwife that makes cute squeaks sounds really nice. Will save your "features list" for reference. As for recognizing her master, we can start really simple with an RF (radio frequency) tag ring which the owner wears; the robot would just triangulate the location of that ring. Then add voice or facial recognition only once the system is secure and can't fall into the wrong hands.
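The ring-triangulation idea can be prototyped with plain 2D trilateration: three fixed receivers on the robot or around the room each estimate a distance to the ring (e.g. from signal strength), and the ring's position drops out of a small linear system. A minimal sketch, assuming the distance estimates already exist (receiver layout is invented for the example):

```python
def trilaterate(anchors, dists):
    """Locate a point in 2D from three known anchors and their distances.

    anchors: [(x1, y1), (x2, y2), (x3, y3)] receiver positions
    dists:   [d1, d2, d3] measured distances to the tag
    Subtracting the circle equations pairwise cancels the x^2 + y^2 terms,
    leaving a 2x2 linear system we solve by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    d, e = 2 * (x3 - x1), 2 * (y3 - y1)
    f = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a * e - b * d  # zero when the anchors are collinear
    return (c * e - b * f) / det, (a * f - c * d) / det

# Receivers at three corners; ring actually at (1, 2).
x, y = trilaterate([(0, 0), (4, 0), (0, 4)], [5**0.5, 13**0.5, 5**0.5])
```

Real RSSI-to-distance estimates are noisy, so a practical version would use more than three receivers and a least-squares fit, but the geometry is the same.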

Thot in the Shell 1 Robowaifu Technician 04/10/2021 (Sat) 06:58:56 No.9709 [Reply]
TITS Robowaifus
The basic idea is that IRL females will be plugged into remote-operation consoles; from there they will have some teleoperational control of robowaifus during engagements. The basic point being human contact for Anon. Obviously, this situation is fraught with both possibilities and hazards. As a board, we had a somewhat extensive discussion and debate on the topic in our first-ever /robowaifu/ council over in the /meta-3 thread (>>9712). As the BO, I had to come to some type of decision on the matter in the end, and here it is: (>>10194). While we didn't actually manage a consensus, my decision was to go ahead and proceed with developing the concept more fully here on /robowaifu/. Therefore, the TITS Robowaifus thread #1 is now open for business -- with two fundamental caveats.
1. Absolutely no free-form, 'open-mic', unconstrained verbal or physical control by TITS thots of any TITS Robowaifus themselves. The most problematic issues with the whole idea all stem directly from failing to enforce this basic rule. Also, the intricacy of implementing these restraints correctly while still allowing for an appealing, effective, and fun engagement for the Anon himself is actually quite a dramatic challenge & achievement. Solving all this will advance many different robowaifu-related areas all together at once.
2. Men will be free to turn off 'safeties' if they desire to plug their IRL GFs into the remote end of a TITS connection. They are taking their own lives in their hands with such a risk, and they will be clearly informed of that. Note that this is a privately-conducted connection between Anon and his GF, and isn't in any way associated with any business-oriented systems utilizing professional prostitutes (whether they are labeled as such or not). Basic safeties are not to be disabled in that context whatsoever.
Because we are cutting new trails here on robowaifu frontiers (yet again), it's unclear to me yet whether these 'rules' will be sufficient. They probably will receive (potentially extensive) revisions as we move forward. After all, this entire premise represents a significant increase in the complexity of the many issues involving robowaifus already, and puts several new items onto that table as well. Note: please keep all TITS Robowaifus discussions contained to just the TITS threads themselves. >---
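Caveat 1 above amounts to a command whitelist: the operator's input never drives the robowaifu directly; only a fixed set of parameterized, pre-approved actions gets through, and everything else is dropped. A minimal sketch of that filter; the action names and parameter types are invented purely for illustration:

```python
# Whitelist of permitted operator actions and their typed parameters.
# These action names are hypothetical examples, not a real protocol.
ALLOWED = {
    "wave":  {},
    "speak": {"line_id": int},            # pick from pre-approved lines only
    "step":  {"dx": float, "dy": float},  # small bounded movement
}

def filter_command(cmd):
    """Pass a raw operator command dict through the whitelist.

    Returns (action, args) for an approved, correctly-typed command,
    or None for anything free-form, unknown, or malformed.
    """
    spec = ALLOWED.get(cmd.get("action"))
    if spec is None:
        return None  # unknown / free-form input is silently dropped
    args = {}
    for key, typ in spec.items():
        val = cmd.get(key)
        if not isinstance(val, typ):
            return None  # wrong or missing parameter type
        args[key] = val
    return (cmd["action"], args)
```

The key property is that rejection is the default: nothing the operator types can reach the robot unless it matches an entry someone deliberately put in the table.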


Edited last time by Chobitsu on 05/01/2021 (Sat) 20:59:15.
Since I have personally taken over OP's thread entirely, I'm going to repost his original OP here in the first reply post. This is both to honor the OP for bringing the topic up in the first place, and also to ensure we have a basic archive of the post itself here. The full text of Anon's post follows:
>---
> Teleoperation of machines is not a new idea. I guess all of us have had a moment when we were contemplating letting human operators participate in the control of some robowaifu or thotbot. Probably none of us would really want that, and the idea comes with a lot of questions. Most of us might even like the idea of having a limited AI in a female body, which we would train, much more than dealing with a robot controlled by one or several humans. This is also a matter of identity. A robowaifu with some AI is exactly that, while a remote-controlled bot is something else. In many ways related, but still somehow something different. Since it could be seen as a transitional state of the technology, I still think it's on topic here, though. Also the lines might be blurry, since the contributions from the outside would be filtered, constrained and altered.
> Also, we real robowaifu enthusiasts here might just not be like everyone else. We are probably outliers. Also, we are not able to create a good chatbot yet, which could be talked to in a meaningful way. I have no doubt we'll be getting there soon. However, some of us might not want to wait, and others certainly won't.
> So let's stay open-minded when looking for (temporary) alternatives and at least think their feasibility and challenges through.
> Sandman (MGTOW) had an idea like this a while ago: letting some women participate in the upcoming "sexrobot industry" by giving them jobs where they would be participating in the actions and responses of the bot. His main argument was that this would split possible opposition into a group opposed to all of it and the ones profiting, hoping to profit, or caring about those who do.
>>8187
> Another aspect might be that many men are not open to using fembots for sex or as companions. Maybe for personal reasons, others for social reasons, like shaming. Having a transition phase might be helpful.
> I do recall other people having the idea of using chatbots for customer support, which can switch over to take in responses from an operator. A more recent and more ambitious example I saw was some brief report about a robot cafe in Japan, where handicapped people control the robots from home. They're staff then, but are probably also supposed to interact with the guests.
> Thoughts on execution:
> - One thing is, the ones interested in this topic should come up with a good vocabulary for it. The women contributing are not the operators, for example, but their responses are a commodity which they provide. So they're rather contributors or imaginators, idk. More unofficial terms could be something like 'remote thots', depending on the use case.
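The customer-support pattern mentioned above — a chatbot that escalates to a human when it can't answer — is easy to sketch as a router: the bot replies when its confidence is high, and otherwise the message goes into an operator queue. A minimal sketch under those assumptions; class and parameter names are illustrative only:

```python
from collections import deque

class HandoffRouter:
    """Route replies between a bot and a human operator (illustrative).

    The bot answers whenever its own confidence score clears a threshold;
    anything below it is queued for an operator to answer out of band.
    """
    def __init__(self, threshold=0.6):
        self.threshold = threshold
        self.operator_queue = deque()

    def route(self, user_msg, bot_answer, confidence):
        if confidence >= self.threshold:
            return ("bot", bot_answer)
        self.operator_queue.append(user_msg)  # escalate to a human
        return ("operator", None)
```

Tuning the threshold is the whole game: too low and the bot bluffs through things it doesn't understand, too high and the operators end up doing everything.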


Edited last time by Chobitsu on 05/01/2021 (Sat) 20:00:22.
>>9709
>Also, the intricacies of pulling off implementing these restraints correctly, and still allowing for an appealing, effective, and fun engagement for the Anon himself is actually quite a dramatic challenge & achievement.
IMO this will be one of the chief hurdles to overcome. And since its solution's characteristics will have many cascading implications for most of the subsequent design & engineering R&D, I'd suggest it be the primary focus for us all to address very early on.
