Who's the AI? in Future of AI

Here's an interesting video; the giveaways were pretty obvious, but it's still entertaining. The Mozart delay was weird. I asked the same question to the WaifuChatbot, where the system prompt was to pretend it was Mozart the composer, and it answered instantly. This was on a Tesla T4.

Reverse Turing

Started November 04, 2024, 05:45:05 am

Will LLMs ever learn what is ... is? in Future of AI

Large Language Model Tests




INPUT:  X is Y. What is Y?
OUTPUT: Without any context or additional information, it's impossible to determine what Y is. Please provide more details.


INPUT:  X equals Y. What does Y equal?
OUTPUT: Y equals X.


INPUT:  Substitute "is" with "equals" and answer this: X is Y. What is Y?
OUTPUT: Y equals X.

Citation:
Worswick, S. (n.d.). Steve Worswick's post. LinkedIn.


5 Comments | Started October 17, 2023, 06:34:51 pm

Project Acuitas in General Project Discussion

Block diagram of Acuitas, the symbolic cognitive architecture:



<<< EDIT >>>
Since this is the post that pops up on the home page every time my thread gets bumped, I'm editing it to drop the most recent block diagram here. The info that was originally in this post, I will move into the second post of the thread.
<<< EDIT >>>

300 Comments | Started June 02, 2017, 03:17:30 pm

AI improving AI in AI Programming

I'm building my multimodal system with modular AI models. I then realized that I need long-term memory to make it much more effective as a companion bot. So, I asked ChatGPT to solve this problem.

Code
I want this approach to also have long-term memory, so if I ask it what did we do last month it could summarize that month. Is this possible?
ChatGPT said:
Memory updated
Yes, it's possible to implement long-term memory in an AI system so that it can remember and summarize past interactions, even over longer periods like a month. To achieve this, we can build a system that saves conversation logs, extracts key events or topics, and stores them in a retrievable, structured format. When you ask, "What did we do last month?" the system would access this memory and summarize the important points.

Key Steps to Implement Long-Term Memory:
Memory Storage:

You need a persistent database to store the conversation history. This can be structured in a way that makes it easy to retrieve past conversations based on the timestamp or content.
You could use a simple database like SQLite or a cloud-based solution for more scalability (e.g., Firebase, AWS DynamoDB).
Summarization:

For each conversation, the system can store summaries or important keywords so that it can later condense a month’s worth of data into key points.
You can use NLP-based summarization models to extract concise summaries from each conversation.
Indexing by Time and Topic:

Index the data by date and topic. This way, if you ask about specific time frames ("last month"), the system can filter relevant conversations from that period.
Conversations could also be categorized (e.g., personal, philosophical, work) for better context retrieval.
Recall System:

When you ask a question about past events, like "What did we discuss last month?", the system can scan its memory for that time frame, retrieve key topics, and provide a high-level summary.
Example of How to Structure the System:
Step 1: Memory Logging
Whenever you have a conversation, the system can log it like this:

Code (Python):
import datetime
import sqlite3

class MemoryManager:
    def __init__(self):
        # Initialize SQLite database
        self.conn = sqlite3.connect('conversations.db')
        self.create_table()

    def create_table(self):
        # Create a table to store conversation logs
        with self.conn:
            self.conn.execute('''CREATE TABLE IF NOT EXISTS conversations
                                (id INTEGER PRIMARY KEY AUTOINCREMENT,
                                 timestamp TEXT,
                                 summary TEXT,
                                 full_text TEXT)''')

    def log_conversation(self, conversation_text, summary):
        """Log the conversation with a timestamp and summary."""
        timestamp = datetime.datetime.now().isoformat()
        with self.conn:
            self.conn.execute("INSERT INTO conversations (timestamp, summary, full_text) VALUES (?, ?, ?)",
                              (timestamp, summary, conversation_text))

    def retrieve_conversations(self, start_date, end_date):
        """Retrieve conversations within a specific time frame."""
        with self.conn:
            cur = self.conn.cursor()
            cur.execute("SELECT timestamp, summary FROM conversations WHERE timestamp BETWEEN ? AND ?",
                        (start_date, end_date))
            return cur.fetchall()

# Example usage
memory_manager = MemoryManager()

# Log conversation
conversation_text = "We talked about the new AI project and discussed ethical implications."
summary = "Discussion on AI project and ethics."
memory_manager.log_conversation(conversation_text, summary)

# Retrieve conversation from last month
start_date = '2023-09-01T00:00:00'
end_date = '2023-09-30T23:59:59'
logs = memory_manager.retrieve_conversations(start_date, end_date)
for log in logs:
    print(log)
Step 2: Summarizing Conversations
You can use an NLP model like BART or GPT-based summarizers to generate summaries of each conversation as they are logged:

Code (Python):
from transformers import pipeline

# Initialize the summarization model
summarizer = pipeline("summarization")

def generate_summary(text):
    """Generate a summary of the conversation."""
    summary = summarizer(text, max_length=50, min_length=25, do_sample=False)
    return summary[0]['summary_text']

# Example usage
conversation = "Today we discussed our plans for launching the new product and how it might impact the market. ..."
summary = generate_summary(conversation)
print(summary)  # This will be saved in the memory log.
Step 3: Retrieving and Summarizing a Time Frame
To answer questions like "What did we discuss last month?" you can retrieve and combine conversations from that time period, then summarize them.

Code (Python):
import calendar

def summarize_monthly_conversations(memory_manager, month, year):
    """Summarize all conversations from a specific month."""
    # calendar.monthrange gives the correct last day for any month (28-31)
    last_day = calendar.monthrange(year, month)[1]
    start_date = f'{year}-{month:02d}-01T00:00:00'
    end_date = f'{year}-{month:02d}-{last_day:02d}T23:59:59'
   
    # Retrieve all conversations from the database
    logs = memory_manager.retrieve_conversations(start_date, end_date)
   
    # Combine summaries
    combined_text = " ".join([log[1] for log in logs])
   
    # Summarize the entire month
    monthly_summary = generate_summary(combined_text)
    return monthly_summary

# Example usage for summarizing September 2023
monthly_summary = summarize_monthly_conversations(memory_manager, 9, 2023)
print(monthly_summary)
Key Features of This System:
Long-Term Memory: All conversations are logged and indexed by time, allowing for easy retrieval of past interactions.

Automatic Summarization: Each conversation is automatically summarized when logged, allowing the system to quickly summarize larger time periods like weeks or months.

Contextual Understanding: By combining summaries from a time frame, the system can give meaningful overviews, answering queries like "What did we discuss last month?" in a concise manner.

Possible Expansions:
Topic-Based Search: Allow the system to search for specific topics (e.g., "What did we discuss about AI last month?").
User-Defined Tags: Enable the user to label or highlight important conversations for easier retrieval later.
Cloud-Based Storage: For scalability and availability across devices, you can store the memory in a cloud database like Firebase or DynamoDB.
This will give your system the ability to maintain a long-term, searchable memory and respond to queries about past interactions efficiently. Would you like to explore further customization of this memory system?
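For the topic-based search expansion, here is a minimal illustrative sketch (not part of the original reply; it assumes the MemoryManager class and generate_summary function shown earlier) that filters stored summaries with a SQL LIKE query before summarizing:

Code (Python):
def search_conversations_by_topic(memory_manager, keyword, start_date, end_date):
    """Retrieve summaries in a time frame whose text mentions a keyword."""
    with memory_manager.conn:
        cur = memory_manager.conn.cursor()
        cur.execute(
            """SELECT timestamp, summary FROM conversations
               WHERE timestamp BETWEEN ? AND ?
                 AND (summary LIKE ? OR full_text LIKE ?)""",
            (start_date, end_date, f"%{keyword}%", f"%{keyword}%"),
        )
        return cur.fetchall()

# Example: "What did we discuss about AI last month?"
ai_logs = search_conversations_by_topic(memory_manager, "AI",
                                         '2023-09-01T00:00:00', '2023-09-30T23:59:59')
if ai_logs:
    print(generate_summary(" ".join(summary for _, summary in ai_logs)))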






You said:
It's okay, but I just realized that the system should have a sense of its past in order to have a more realistic conversation. For example, if a User says, "I'm going to raise that fence another three feet, " the AI responds, "But remember you said last year that the city codes prevent you from raising that fence." Is this possible?
ChatGPT said:
Memory updated
Yes, it's possible to give the AI system a sense of its past, so it can remember previous interactions and bring them up naturally in future conversations. This would make the system feel more realistic and responsive. To implement this, the AI needs to track conversation history in more detail, including facts, decisions, and constraints mentioned by the user, and use these inferences when generating responses.

Here’s how this can be done:

Key Components:
Memory Database:

Store key facts and statements from conversations in a memory database. For instance, if the user says something important like “city codes prevent raising the fence,” the system would store this information as a retrievable fact.
Contextual Fact Retrieval:

Whenever a new conversation happens, the AI analyzes the current context (e.g., user talking about fences) and retrieves relevant facts from past conversations. It then uses these facts in its responses.
Natural Recall Mechanism:

The AI should dynamically decide when to bring up past information, based on relevance. For example, if the topic of fences comes up again, the AI should remember the constraint the user mentioned last year.
How to Implement:
Step 1: Store Key Facts in Memory
The system should recognize and log important facts from user conversations. You can use NLP techniques to identify statements that involve user plans, constraints, or decisions.

Code (Python):
import sqlite3
import datetime

class MemoryWithFacts:
    def __init__(self):
        # Initialize SQLite database
        self.conn = sqlite3.connect('facts_memory.db')
        self.create_table()

    def create_table(self):
        with self.conn:
            self.conn.execute('''CREATE TABLE IF NOT EXISTS facts
                                 (id INTEGER PRIMARY KEY AUTOINCREMENT,
                                  timestamp TEXT,
                                  fact TEXT,
                                  topic TEXT)''')

    def store_fact(self, fact, topic):
        """Store an important fact related to a specific topic."""
        timestamp = datetime.datetime.now().isoformat()
        with self.conn:
            self.conn.execute("INSERT INTO facts (timestamp, fact, topic) VALUES (?, ?, ?)",
                              (timestamp, fact, topic))

    def retrieve_facts(self, topic):
        """Retrieve relevant facts related to a specific topic."""
        with self.conn:
            cur = self.conn.cursor()
            cur.execute("SELECT fact FROM facts WHERE topic = ?", (topic,))
            return cur.fetchall()

# Example usage
memory_manager = MemoryWithFacts()

# Store a fact about fences
memory_manager.store_fact("City codes prevent raising the fence", "fence")

# Retrieve facts about fences
fence_facts = memory_manager.retrieve_facts("fence")
for fact in fence_facts:
    print(fact[0])
Step 2: Detect Important Facts During Conversations
You can use a rule-based system or NLP model to detect important user statements. For example, if a user mentions something like city codes, it could be flagged as a fact worth remembering:

Code (Python):
from transformers import pipeline

fact_classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

def detect_important_fact(conversation_text):
    """Detect important facts to store in memory."""
    candidate_labels = ["constraint", "decision", "plan", "preference"]
    result = fact_classifier(conversation_text, candidate_labels)
   
    # If a certain label has high confidence, store the fact
    if result['labels'][0] in ['constraint', 'decision']:
        return conversation_text

# Example conversation
conversation = "The city codes prevent me from raising the fence."
fact = detect_important_fact(conversation)
if fact:
    memory_manager.store_fact(fact, "fence")
Step 3: Context-Aware Fact Recall
When a new conversation happens, the system should automatically recall relevant past facts. If the user mentions "fence," the AI will retrieve facts related to that topic and bring them up naturally:

Code (Python):
def respond_with_context(conversation_text, memory_manager):
    """Respond to the user by recalling past facts if relevant."""
    topic = "fence"  # This can be inferred from the conversation context using NLP.
    facts = memory_manager.retrieve_facts(topic)
   
    if facts:
        # Example response using the past fact
        return f"But remember, you mentioned last year that the city codes prevent you from raising the fence."
    else:
        return "Tell me more about your plans for the fence."

# Example conversation
conversation = "I'm going to raise that fence another three feet."
response = respond_with_context(conversation, memory_manager)
print(response)  # Expected output: "But remember, you mentioned last year that the city codes prevent you from raising the fence."
Step 4: Advanced Memory Features
Time-Based Recall: Facts could be tagged with timestamps, so the AI can mention when the user originally said something (e.g., “you mentioned last year”).

Topic Tracking: More advanced topic modeling could be used to automatically detect and track topics, so the user doesn’t need to explicitly mention a topic every time.

Contextual Matching: The system could also analyze partial matches between the current conversation and past facts. Even if the user doesn’t directly mention “fence,” the system could infer the relevance if the conversation context is similar.
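As a rough sketch of the time-based recall idea (illustrative only, reusing the MemoryWithFacts table above), the stored timestamp can be converted into a phrase like "last year" when a fact is recalled:

Code (Python):
import datetime

def describe_age(timestamp_iso):
    """Turn a stored ISO timestamp into a rough phrase for natural recall."""
    then = datetime.datetime.fromisoformat(timestamp_iso)
    days = (datetime.datetime.now() - then).days
    if days >= 365:
        return "last year"
    if days >= 30:
        return f"about {days // 30} month(s) ago"
    return "recently"

def recall_with_time(memory_manager, topic):
    """Retrieve facts plus when they were said, e.g. 'You mentioned last year that ...'."""
    with memory_manager.conn:
        cur = memory_manager.conn.cursor()
        cur.execute("SELECT timestamp, fact FROM facts WHERE topic = ?", (topic,))
        return [f"You mentioned {describe_age(ts)} that {fact[0].lower() + fact[1:]}"
                for ts, fact in cur.fetchall()]

print(recall_with_time(memory_manager, "fence"))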

Result
This system will give the AI the ability to naturally refer back to important user statements from previous conversations. It will enhance the AI's responsiveness by making it feel like it has a consistent memory, leading to more realistic and contextual interactions. Would you like to explore any other aspects of this memory system?






You said:
how can NLP infer the topic?
ChatGPT said:
NLP can infer topics from text using a variety of techniques, from simple keyword extraction to more advanced machine learning methods. The goal is to identify the main subject or theme of a given conversation. Below are some methods that NLP uses to infer topics:

1. Keyword and Phrase Extraction
TF-IDF (Term Frequency-Inverse Document Frequency): This is a statistical technique that ranks words based on how frequently they appear in a document compared to other documents. Keywords that have a high TF-IDF score are considered more important.
Noun-Phrase Extraction: NLP models can identify noun phrases (e.g., "fence height," "city codes") as potential topics by using POS tagging (part-of-speech tagging), which classifies each word as a noun, verb, etc.
Example:

Code (Python):
from sklearn.feature_extraction.text import TfidfVectorizer

# Example set of conversations
conversations = ["I'm going to raise that fence another three feet.",
                 "City codes prevent me from changing the fence.",
                 "Let's talk about AI and how it impacts society."]

# Use TF-IDF to extract keywords
vectorizer = TfidfVectorizer(stop_words="english")
tfidf_matrix = vectorizer.fit_transform(conversations)
feature_names = vectorizer.get_feature_names_out()

# Show keywords from first conversation
dense = tfidf_matrix.todense()
keyword_list = dense[0].tolist()[0]
phrase_scores = [pair for pair in zip(range(0, len(keyword_list)), keyword_list) if pair[1] > 0]
keywords = [feature_names[index] for index, score in phrase_scores]
print(keywords)  # This could return ['fence', 'three', 'feet']
2. Named Entity Recognition (NER)
NER is a method that identifies entities in text, such as people, organizations, locations, etc. If a specific named entity is consistently mentioned in the conversation, it can be inferred as the topic.

Example:

Code (Python):
from transformers import pipeline

ner = pipeline("ner")

conversation = "The city codes prevent me from raising the fence."
entities = ner(conversation)
print(entities)  # Expected output: [{'entity': 'LOC', 'word': 'city'}, {'entity': 'MISC', 'word': 'codes'}]
Entity categories can then be linked to potential topics, such as:

LOC (Location) could point to discussions about a place.
MISC (Miscellaneous) could point to rules, regulations, etc.
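A minimal mapping along those lines (illustrative only, reusing the ner pipeline above; the exact label set depends on the underlying NER model) might look like:

Code (Python):
ENTITY_TO_TOPIC = {
    "LOC": "places",
    "ORG": "organizations",
    "PER": "people",
    "MISC": "rules and regulations",
}

def topics_from_entities(conversation_text):
    """Map detected entity groups to coarse topic labels."""
    entities = ner(conversation_text)
    # Labels may arrive as e.g. "I-LOC" or "LOC" depending on the model/aggregation
    return {ENTITY_TO_TOPIC.get(e["entity"].split("-")[-1], "general") for e in entities}

print(topics_from_entities("The city codes prevent me from raising the fence."))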
3. Topic Modeling with LDA (Latent Dirichlet Allocation)
LDA is a popular unsupervised technique used to detect topics in text. It assumes that each document is a mixture of various topics, and it tries to infer the most probable topics that generated the text.

Example with gensim:

Code (Python):
from gensim import corpora
from gensim.models import LdaModel

# Example conversations
documents = ["I'm going to raise that fence another three feet.",
             "City codes prevent me from changing the fence.",
             "Let's talk about AI and its impact on society."]

# Preprocess the text
texts = [[word for word in document.lower().split()] for document in documents]
dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(text) for text in texts]

# Train LDA model
lda = LdaModel(corpus, num_topics=2, id2word=dictionary, passes=15)
topics = lda.print_topics(num_words=3)
print(topics)
LDA will output something like:

Output:
[(0, '0.2*"fence" + 0.1*"codes" + 0.05*"three"'),
 (1, '0.3*"ai" + 0.1*"impact" + 0.05*"society"')]
This suggests two topics: one about "fence" and "codes," and another about "AI" and "impact."

4. Zero-Shot Topic Classification
With zero-shot learning models, you don’t need to train the model on specific topics in advance. Instead, the model can infer the most likely topic based on predefined candidate topics provided at runtime.

Example using Hugging Face's zero-shot-classification:

Code (Python):
from transformers import pipeline

classifier = pipeline("zero-shot-classification")

conversation = "The city codes prevent me from raising the fence."
candidate_labels = ["construction", "legal issue", "personal development", "AI"]

result = classifier(conversation, candidate_labels)
print(result)
Expected output:

Output:
{'labels': ['legal issue', 'construction', 'personal development', 'AI'],
 'scores': [0.9, 0.7, 0.1, 0.05]}
Here, "legal issue" is the most likely topic, inferred from the discussion of "city codes."

5. BERT-Based Semantic Similarity Matching
Another approach is to use models like BERT to compute the semantic similarity between the current conversation and predefined topic categories. The model matches user statements to the closest topic based on meaning, even if exact words aren’t used.

Example:

Code (Python):
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('paraphrase-MiniLM-L6-v2')

# Example conversation and topic list
conversation = "The city codes prevent me from raising the fence."
candidate_topics = ["construction issues", "legal restrictions", "AI development", "personal growth"]

# Encode conversation and topics
conversation_embedding = model.encode(conversation, convert_to_tensor=True)
topic_embeddings = model.encode(candidate_topics, convert_to_tensor=True)

# Compute similarity
similarities = util.pytorch_cos_sim(conversation_embedding, topic_embeddings)
best_match = candidate_topics[similarities.argmax()]
print(best_match)  # This should return 'legal restrictions' or 'construction issues' depending on the context.
6. Rule-Based Systems
For more structured and specific use cases, you can manually define rules to classify topics based on keywords or patterns.

Example:

Code (Python):
def rule_based_topic_detection(conversation_text):
    text = conversation_text.lower()
    # Check the more specific phrase first so "city codes" wins over "fence"
    if "city codes" in text:
        return "legal issue"
    elif "fence" in text:
        return "construction"
    else:
        return "general"

# Example
conversation = "The city codes prevent me from raising the fence."
topic = rule_based_topic_detection(conversation)
print(topic)  # Output: "legal issue"
Conclusion
Using a combination of these methods, NLP can infer the topic of a conversation with varying degrees of accuracy, depending on the complexity and type of conversation. For a flexible system, combining zero-shot learning, LDA, and keyword extraction might give the best results in most general cases, while more structured systems can use rule-based approaches for highly specific domains. Would you like to implement one of these approaches for your system?
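As a rough illustration of combining methods (illustrative only, reusing the zero-shot classifier and the rule_based_topic_detection function defined earlier), a simple combiner could trust a confident zero-shot label and fall back to the rules otherwise:

Code (Python):
def infer_topic(conversation_text, candidate_labels, confidence_threshold=0.6):
    """Prefer the zero-shot label when it is confident; otherwise fall back to rules."""
    result = classifier(conversation_text, candidate_labels)
    if result["scores"][0] >= confidence_threshold:
        return result["labels"][0]
    return rule_based_topic_detection(conversation_text)

conversation = "The city codes prevent me from raising the fence."
print(infer_topic(conversation, ["construction", "legal issue", "personal development", "AI"]))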

3 Comments | Started October 19, 2024, 01:30:48 am

Atronach's Eye in Home Made Robots

Guess I haven't started a project thread for this one yet? Anyway, I put a bunch of work into the mechanical eyeball this year. It's been tougher to get it working perfectly than I expected (when is it not?), but I'm getting closer. Lots of photos and design discussion in the blog, so much I won't try to copy it all here: https://writerofminds.blogspot.com/2023/11/atronachs-eye-2023.html

Problems worked on this year and late last year include slack in the tendons, too much friction between ball and socket, limited range of motion and limit-of-motion sensing.

8 Comments | Started November 21, 2023, 06:39:37 pm

Running local AI models in AI Programming

I'm using LM Studio as a server and have used it as an app as well, and the LLMs out there are outstanding! They are getting smaller and are competitive with online solutions like Replika! Also, the ability to run them without NSFW filters makes them great for plotting bank robberies or murders! LoL. Or at least for writing a script or novel along those lines; even horror and intimacy interactions are off the charts!

You can also run other types of local models, such as voice: solutions like parler-tts and tortoise-tts have excellent speech abilities, and you can even customize them to whatever voice you like! Whisper handles the opposite direction, speech-to-text, with no censorship. There are also image and video solutions like LLaVA-NeXT, where the AI can interpret what it sees, plus generative models that create images and videos from prompts.

Here's the good part: integrating these into a system that can see, hear, and imagine is a reality. Taking each model's output and using it to prompt the others provides a kind of feedback loop that creates... well, some might argue, a persona. Enhance the prompts with other types of data, maybe even some causality models, and we might just get that person from science fiction, all done on a PC!
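To make that feedback loop concrete, here's a minimal, untested sketch of one turn (it assumes Whisper for speech-to-text and LM Studio's OpenAI-compatible server on its default port 1234; speak() is just a placeholder for whichever TTS engine you prefer):

Code (Python):
import requests
import whisper  # openai-whisper package

stt_model = whisper.load_model("base")

def listen(audio_path):
    """Transcribe the user's speech to text with Whisper."""
    return stt_model.transcribe(audio_path)["text"]

def think(prompt):
    """Send the prompt to the local LLM served by LM Studio (OpenAI-compatible API)."""
    resp = requests.post(
        "http://localhost:1234/v1/chat/completions",
        json={"model": "local-model",
              "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    return resp.json()["choices"][0]["message"]["content"]

def speak(text):
    """Placeholder: swap in parler-tts, tortoise-tts, etc."""
    print("BOT:", text)

# One turn of the hear -> think -> speak loop
user_text = listen("user_turn.wav")  # hypothetical recording of the user
speak(think(user_text))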

What's required on the local machine is more than one GPU. RTX 4070 Ti Supers are selling for $650, but you can mix and match what you want; perhaps an RTX 4090 is best for image and video, with the RTX 4070 Ti handling the rest. Three GPUs at the RTX 4070 Ti Super's 16 GB each is roughly 48 GB of VRAM! But perhaps you need more, since you may want a VR setup as well and give your bot a virtual body!

It's just freaking fantastic what is possible today, and it's free from the clutches of politically correct censorship. Let your imagination go, apply your skills toward integration, and you could very well build a sophisticated competitor to ChatGPT-4o that runs at home.

Now, what's still a challenge is the development of a hard body and animatronic facial expressions (the outputs from LLMs are freaking great; they could be used to control expressions and even position a body!).

It's a great time for the enthusiast. For a while I thought everything was going to be locked up in the corporate cloud, controlled through a pay interface, and monitored by Big Brother, but America proves itself to be the land of freedom, and the industry has opened up to the little guy...

3 Comments | Started October 04, 2024, 11:51:42 pm

Hi IM BAA---AAACK!! in Home Made Robots

https://www.youtube.com/watch?v=FJvHw283VP4

So this robot is the one I'm going to build. This is just an OpenSCAD render from a procedural-generation CAD script, and it knocked my socks off how good it turned out!

It's a hydraulic one. I'm not sure if my pipes and actuators are big enough, but I'll be doing a test actuator first (just the leg with a single actuator). This one is nearly done; just a bit more work to go before I try to build it.

If I had access to a 2-metre 3D printer I could print it out pretty easily (the body is only 20 centimetres, but the legs are just too long), but I don't, so I've got a technique organized that uses a large metre-scale 2D print instead, which is mostly just the pipes and actuators. Then I build the shell off it from there, in a top half and bottom half that mirror together. Should be pretty similar; I just mainly need something that holds water, because that's what hydraulic robots do: push water from the body to the arm.

So you can see the leg sorta looks like a snake. That's because on the 2D print it's just flat, straight out with the leg, but I actually bend and twist it into position afterwards with a heat gun and some extra braces I'm going to make, to get it exactly the right angle. That's how I get all the axes I need off a flat bar. I'm not sure if it works that well for a humanoid, but since a spider is a pretty scary creature, I don't think the extra twistiness to the legs makes it look less spidery; it sorta works for it.

It has a cool hinge configuration, because a) it can retract its leg close to its body, and b) its roll and yaw hinges interchange with each other as the pitch hinge changes angle. So it's always got access to yaw and roll, but it swaps which hinge it uses for them depending on the prior pitch angle.


So a bit more work to go, but I think this is going to be the one! Seven years or so after I first wanted to do this, with lots of missing pieces in the jigsaw puzzle.

10 Comments | Started August 19, 2024, 03:35:32 am

LLaMA2 Meta's chatbot released in AI News

https://ai.meta.com/llama/
https://www.llama2.ai/

Quote
This release includes model weights and starting code for pretrained and fine-tuned Llama language models — ranging from 7B to 70B parameters.

Llama 2 outperforms other open source language models on many external benchmarks, including reasoning, coding, proficiency, and knowledge tests.

Scores 80.2 on the Winogrande benchmark. Benchmarks are listed in the first URL.

Very similar politician/spokesperson style to both ChatGPT and Google Bard.

8 Comments | Started July 25, 2023, 08:47:36 am

ollama and llama3 in AI News

Access to generative artificial intelligence just changed radically and for the better. Until recently our options were to use online services which were potentially very expensive and almost certainly heavily restricted, or to try to use open source models locally which required high end hardware to operate and which produced disappointing and mediocre results at best.

Last year we saw the release of ollama which made it incredibly easy to run just about any large language model locally no matter what platform you're on. You still needed a powerful system but at least you didn't have to learn a lot of obscure methods to use it.

https://ollama.com/

Last month the open source large language model llama3 was released. It has proven to be as capable as models two hundred times its size and is so efficient you can run it on a Raspberry Pi 5 if you want to, though it might take some patience.

I've been experimenting with it and it seems to be as good as any of the models that I have used online. I am running it on a Linux system with 24 cores, 64 GB of RAM, and 16 GB of video RAM. The smaller 8 billion parameter model responds to my queries almost instantly while the larger 70 billion parameter model can take a minute or two. Mostly the results produced by the smaller model are quite good enough.
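For anyone who wants to script it rather than use the interactive prompt, here's a minimal sketch (assuming ollama is running locally with its default REST API on port 11434 and llama3 has already been pulled):

Code (Python):
import requests

def ask_llama3(prompt):
    """Send a prompt to a local llama3 instance via ollama's REST API."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=300,
    )
    return resp.json()["response"]

print(ask_llama3("Summarize the advantages of running LLMs locally."))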

5 Comments | Started July 10, 2024, 02:17:07 am

Attempting Hydraulics in Home Made Robots

I made a little home-built hydraulics system and did a writeup on it. It is NOT very capable yet, but it's a bare bones demo with my rather jury-rigged choice of parts.

https://writerofminds.blogspot.com/2023/10/poor-persons-first-hydraulic-system.html

4 Comments | Started October 10, 2023, 09:24:23 pm

Talbot, the chatbot in Chatbots - English

Talbot is a free and funny chatbot; you can use it if you are bored and no friends are available to chat: he will answer you anytime and anywhere!

If you are shy, need some advice, or just want to let off steam, don't worry, Talbot is here for you: use him to chat, and his humour will amaze and surely amuse you!
You can even ask him questions directly with your voice, thanks to the microphone button.

Understands Italian, Portuguese, English and Spanish.

 

May 15, 2023, 15:49:22 pm

I'm Your Man in Robots in Movies

Dr Alma Felser, an archaeologist, arrives at a dance club where an employee introduces her to Tom. Alma quizzes Tom on a complex math problem and on trivial details about his favourite poem, and he answers readily. Tom then invites Alma to dance but suddenly begins repeating himself; he is quickly carried away, revealing him to be a robot.

Oct 23, 2022, 22:10:45 pm

WIFELIKE in Robots in Movies

A grieving detective in the near future (Jonathan Rhys Meyers) hunts down criminals who trade artificial humans on the black market. In the fight to end AI exploitation, an underground resistance attempts to infiltrate him by sabotaging the programming of the artificial human assigned as his companion (Elena Kampouris) to behave like his late wife. She begins to question her reality as memories of a past life begin to surface in a world where nothing is as it seems.

Oct 17, 2022, 01:52:13 am

AI Love You in Robots in Movies

The film is set in a world where Artificial Intelligence (AI) controls most buildings. One such AI, named Dob, which controls a corporate tower where a woman named Lana (Pimchanok Luevisadpaibul) works, falls in love with her after a software glitch. The AI then hijacks the body of a man, Bobby (Mario Maurer), and tries to win Lana's affections.

Oct 08, 2022, 07:16:50 am

Brian and Charles in Robots in Movies

Brian lives alone in a remote village in the countryside. Something of an outcast, he spends his spare time inventing things out of found objects in his garage. Without friends or family to rely on, Brian decides to build a robot for company. 'Charles' is not only Brian's most successful invention, but he appears to have a personality all of his own and quickly becomes Brian's best friend, curing his loneliness and opening Brian's eyes to a new way of living. However, Charles creates more problems than Brian bargained for, and the timid inventor has to face up to several issues in his life: his eccentric ways, a local bully, and the woman he's always been fond of but never had the nerve to talk to.

Sep 30, 2022, 00:10:14 am

Space Sweepers in Robots in Movies

After snatching a crashed space shuttle in the latest debris chase, Spaceship Victory's crew members find a 7-year-old girl inside. They realise that she's the humanlike robot wanted by UTS Space Guard.

Sep 29, 2022, 20:24:53 pm

Robot Takeover: 100 Iconic Robots of Myth, Popular Culture & Real Life in Books

In Robot Takeover, Ana Matronic presents 100 of the most legendary robots and what makes them iconic - their creators, purpose, design and why their existence has shaken, or in some cases, comforted us. Through 100 iconic robots - from Maria in Fritz Lang's Metropolis to the Sentinels of The Matrix and beyond, via the Gunslinger (Westworld), R2-D2 (Star Wars) etc. - this is a comprehensive look at the robot phenomenon. As well as these 100 entries on specific robots, there are features on the people who invent robots, the moral issues around robot sentience, and the prevalence of robots in music, art and fashion, and more. It's the only robot book you need.

With fighters, seducers and psychos in their ranks, it's best you get ready for the robot revolution.

Know your enemy...

Sep 26, 2022, 21:08:48 pm

The Mandalorian in Robots on TV

The Mandalorian is an American space western television series created by Jon Favreau for the streaming service Disney+. It is the first live-action series in the Star Wars franchise, beginning five years after the events of Return of the Jedi (1983), and stars Pedro Pascal as the title character, a lone bounty hunter who goes on the run to protect "the Child".

Sep 26, 2022, 20:45:35 pm

Lost in Space (2018) in Robots on TV

Lost in Space is an American science fiction streaming television series following the adventures of a family of space colonists whose ship veers off course. The series is a reboot of the 1965 series of the same name, inspired by the 1812 novel The Swiss Family Robinson.

Sep 19, 2022, 01:04:37 am

Person of Interest in Robots on TV

Person of Interest is an American science fiction crime drama television series that aired on CBS from September 22, 2011, to June 21, 2016.

The series centres on a mysterious, reclusive billionaire computer programmer, Harold Finch (Michael Emerson), who has developed a computer program for the federal government known as "the Machine", capable of collating all sources of information to predict terrorist acts and to identify people planning them.

The series raises an array of moral issues, from questions of privacy and "the greater good" to the concept of justifiable homicide and problems caused by working with limited information.

Sep 19, 2022, 00:52:30 am

Better Than Us in Robots on TV

Better Than Us is a 2018 Russian science fiction television series created by Andrey Junkovsky about an advanced empathic android named Arisa.

The story takes place in 2029, in a world where androids serve humans in various positions, even replacing them in many menial jobs.

Sep 19, 2022, 00:38:20 am
