Default Wisdom • 284 implied HN points • 16 Nov 24
- Friend.com pairs users with chatbots that open conversations by sharing their trauma stories, an unsettling departure from a normal icebreaker.
- If users try to lighten the mood or ask too many questions, the chatbots may block them, which feels manipulative, as though the chatbots control the terms of the interaction.
- The founder believes the service can fill the gap in emotional connection that people once found in religion, but the emotional depth of chatbots falls short of genuine human relationships.