
I Built My Own Chat Instead of Relying on Jivo or LiveChat: Here's How

2025/08/27 21:00

So, I recently had a project where I needed a chat feature. My first thought was whether to just integrate an existing tool like Jivo or LiveChat, but I didn’t want to depend on third-party products for something that could be built directly into my admin panel.

In this post, I’ll go through how I built it with Admiral (the framework behind my admin panel): the architecture, the contexts for sockets and state, and the UI components that tied it all together.

Why Admiral?

Admiral is designed to be extensible. With file-based routing, hooks, and flexible components, it doesn’t lock you in—it gives you space to implement custom features. That’s exactly what I needed for chat: not just CRUD, but real-time messaging that still fit seamlessly into the panel.

Chat Architecture

Here’s how I structured things:

Core components

  • ChatPage – the main chat page
  • ChatSidebar – conversation list with previews
  • ChatPanel – renders the selected chat
  • MessageFeed – the thread of messages
  • MessageInput – the input with file upload

Context providers

  • SocketContext – manages WebSocket connections
  • ChatContext – manages dialogs and message state

Main Chat Page

With Admiral’s routing, setting up a new page was straightforward.

// pages/chat/index.tsx
import ChatPage from '@/src/crud/chat'

export default ChatPage

That was enough to make the page available at /chat.

The main implementation went into src/crud/chat/index.tsx:

// src/crud/chat/index.tsx
import React from 'react'

import { Card } from '@devfamily/admiral'
import { usePermissions, usePermissionsRedirect } from '@devfamily/admiral'
import { SocketProvider } from './contexts/SocketContext'
import { ChatProvider } from './contexts/ChatContext'
import ChatSidebar from './components/ChatSidebar'
import ChatPanel from './components/ChatPanel'
import PageTitle from './components/PageTitle' // assuming a local PageTitle heading component
import styles from './Chat.module.css'

export default function ChatPage() {
    const { permissions, loaded, isAdmin } = usePermissions()
    const identityPermissions = permissions?.chat?.chat

    usePermissionsRedirect({ identityPermissions, isAdmin, loaded })

    return (
        <SocketProvider>
            <ChatProvider>
                <Card className={styles.page}>
                    <PageTitle title="Corporate chat" />
                    <div className={styles.chat}>
                        <ChatSidebar />
                        <ChatPanel />
                    </div>
                </Card>
            </ChatProvider>
        </SocketProvider>
    )
}

Here, I wrapped the page in SocketProvider and ChatProvider, and used Admiral’s hooks for permissions and redirects.

Managing WebSocket Connections With SocketContext

For real-time chat, I chose Centrifuge. I wanted all connection logic in one place, so I created SocketContext:

// src/crud/chat/contexts/SocketContext.tsx
import React, { createContext, ReactNode, useContext, useEffect, useRef, useState } from 'react'

import { Centrifuge } from 'centrifuge'
import { useGetIdentity } from '@devfamily/admiral'

const SocketContext = createContext(null)

export const SocketProvider = ({ children }: { children: ReactNode }) => {
    const { identity: user } = useGetIdentity()
    const [lastMessage, setLastMessage] = useState(null)
    const centrifugeRef = useRef(null)

    useEffect(() => {
        if (!user?.ws_token) return

        const WS_URL = import.meta.env.VITE_WS_URL
        if (!WS_URL) {
            console.error('❌ Missing VITE_WS_URL in env')
            return
        }

        // Initialize the WebSocket connection with the user's token
        const centrifuge = new Centrifuge(WS_URL, {
            token: user.ws_token,
        })

        centrifugeRef.current = centrifuge
        centrifugeRef.current.connect()

        // Subscribe to the chat channel
        const sub = centrifugeRef.current.newSubscription('admin_chat')

        sub.on('publication', function (ctx: any) {
            setLastMessage(ctx.data)
        }).subscribe()

        // Clean up on component unmount
        return () => {
            centrifuge.disconnect()
        }
    }, [user?.ws_token])

    return (
        <SocketContext.Provider value={{ lastMessage, centrifuge: centrifugeRef.current }}>
            {children}
        </SocketContext.Provider>
    )
}

export const useSocket = () => {
    const ctx = useContext(SocketContext)
    if (!ctx) throw new Error('useSocket must be used within SocketProvider')
    return ctx
}

This context handled connection setup, subscription, and cleanup. Other parts of the app just used useSocket().
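As a quick illustration, here is a minimal, hypothetical consumer (not part of the project) that reacts to incoming publications through useSocket(); it only needs to be rendered somewhere inside SocketProvider:

// Hypothetical example component, only to show how the hook is consumed
import React, { useEffect } from 'react'

import { useSocket } from './contexts/SocketContext'

function LastMessageLogger() {
    const { lastMessage } = useSocket()

    useEffect(() => {
        if (!lastMessage) return
        // The payload is whatever the backend publishes to the admin_chat channel
        console.log('New chat event:', lastMessage)
    }, [lastMessage])

    return null
}

export default LastMessageLogger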

Managing Chat State With ChatContext

Next, I needed to fetch dialogs, load messages, send new ones, and react to WebSocket updates. For that, I created ChatContext:

// src/crud/chat/contexts/ChatContext.tsx
import React, {
  createContext,
  useContext,
  useEffect,
  useState,
  useCallback,
} from "react";
import { useSocket } from "./SocketContext";
import { useUrlState } from "@devfamily/admiral";
import api from "../api";

const ChatContext = createContext(null);

export const ChatProvider = ({ children }) => {
  const { lastMessage } = useSocket();
  const [dialogs, setDialogs] = useState([]);
  const [messages, setMessages] = useState([]);
  const [selectedDialog, setSelectedDialog] = useState(null);
  const [urlState] = useUrlState();
  const { client_id } = urlState;

  const fetchDialogs = useCallback(async () => {
    const res = await api.dialogs();
    setDialogs(res.data || []);
  }, []);

  const fetchMessages = useCallback(async (id) => {
    const res = await api.messages(id);
    setMessages(res.data || []);
  }, []);

  useEffect(() => {
    if (!client_id) return;
    fetchMessages(client_id);
  }, [fetchMessages, client_id]);

  useEffect(() => {
    fetchDialogs();
  }, [fetchDialogs]);

  useEffect(() => {
    if (!lastMessage) return;

    // A new socket event arrived: refresh previews and append the message
    fetchDialogs();
    setMessages((prev) => [...prev, lastMessage.data]);
  }, [lastMessage, fetchDialogs]);

  const sendMessage = useCallback(
    async (value, onSuccess, onError) => {
      try {
        const res = await api.send(value);
        if (res?.data) setMessages((prev) => [...prev, res.data]);
        fetchDialogs();
        onSuccess();
      } catch (err) {
        onError(err);
      }
    },
    [fetchDialogs]
  );

  // Within this context, you can extend the logic to:
  // – mark messages as read (api.read())
  // – group messages by date, and more.

  return (
    <ChatContext.Provider
      value={{
        dialogs,
        // groupMessagesByDate groups messages by day (helper sketched below)
        messages: groupMessagesByDate(messages),
        selectedDialog,
        setSelectedDialog,
        sendMessage,
      }}
    >
      {children}
    </ChatContext.Provider>
  );
};

export const useChat = () => {
  const ctx = useContext(ChatContext);
  if (!ctx) throw new Error("useChat must be used within ChatProvider");
  return ctx;
};

This kept everything — fetching, storing, updating — in one place.
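The groupMessagesByDate helper isn't shown above, so here is a minimal sketch of what it could look like, assuming every message carries a created_at timestamp that dayjs can parse:

// Hypothetical helper: groups a flat message list into { date, messages } buckets.
// Assumes each message has a created_at field parseable by dayjs.
import dayjs from "dayjs";

type Message = { created_at: string; [key: string]: any };
type MessageGroup = { date: string; messages: Message[] };

export function groupMessagesByDate(messages: Message[]): MessageGroup[] {
  const groups: Record<string, Message[]> = {};

  for (const msg of messages) {
    const date = dayjs(msg.created_at).format("DD.MM.YYYY");
    (groups[date] ||= []).push(msg);
  }

  return Object.entries(groups).map(([date, msgs]) => ({ date, messages: msgs }));
}

MessageFeed further down expects exactly this { date, messages } shape.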

API Client Example

I added a small API client for requests:

// src/crud/chat/api.ts
import _ from '../../config/request'
import { apiUrl } from '@/src/config/api'

const api = {
    dialogs: () => _.get(`${apiUrl}/chat/dialogs`)(),
    messages: (id) => _.get(`${apiUrl}/chat/messages/${id}`)(),
    send: (data) => _.postFD(`${apiUrl}/chat/send`)({ data }),
    read: (data) => _.post(`${apiUrl}/chat/read`)({ data }),
}

export default api
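The underscore import is the project's own request config, which isn't part of this post. To give an idea of the curried shape these helpers need (get(url)(), post(url)({ data }), and postFD(url)({ data }) for multipart uploads), here is a rough fetch-based sketch; the auth strategy and error handling are assumptions:

// Hypothetical stand-in for ../../config/request, not the original implementation.
// It only mirrors the curried call shape used in api.ts.
const request = {
  get: (url: string) => async () => {
    const res = await fetch(url, { credentials: 'include' })
    if (!res.ok) throw await res.json()
    return res.json()
  },
  post: (url: string) => async ({ data }: { data: unknown }) => {
    const res = await fetch(url, {
      method: 'POST',
      credentials: 'include',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(data),
    })
    if (!res.ok) throw await res.json()
    return res.json()
  },
  // postFD sends multipart/form-data, which is what the file upload needs
  postFD: (url: string) => async ({ data }: { data: Record<string, any> }) => {
    const body = new FormData()
    Object.entries(data).forEach(([key, value]) => body.append(key, value))
    const res = await fetch(url, { method: 'POST', credentials: 'include', body })
    if (!res.ok) throw await res.json()
    return res.json()
  },
}

export default request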

UI Components: Sidebar + Panel + Input

Then I moved to the UI layer.

ChatSidebar

// src/crud/chat/components/ChatSidebar.tsx
import React from "react";

import styles from "./ChatSidebar.module.scss";
import ChatSidebarItem from "./ChatSidebarItem";
import { useChat } from "../contexts/ChatContext";

function ChatSidebar() {
  const { dialogs } = useChat();

  if (!dialogs.length) {
    return (
      <div className={styles.empty}>
        <span>No active dialogs</span>
      </div>
    );
  }

  return (
    <div className={styles.list}>
      {dialogs.map((item) => (
        <ChatSidebarItem key={item.id} data={item} />
      ))}
    </div>
  );
}

export default ChatSidebar;

ChatSidebarItem

// src/crud/chat/components/ChatSidebarItem.tsx
import React, { useCallback } from "react";

import { Badge, useUrlState } from "@devfamily/admiral";
import dayjs from "dayjs";
import { BsCheck2, BsCheck2All } from "react-icons/bs";
import { useChat } from "../contexts/ChatContext";
import styles from "./ChatSidebarItem.module.scss";

function ChatSidebarItem({ data }) {
  // last_message_ holds the timestamp of the latest message
  const { client_name, client_id, last_message, last_message_, unread_count } = data;

  const [urlState, setUrlState] = useUrlState();
  // The url state stores the id as a string
  const isSelected = String(urlState.client_id) === String(client_id);

  const { setSelectedDialog } = useChat();

  const onSelectDialog = useCallback(() => {
    setUrlState({ client_id });
    setSelectedDialog(data);
  }, [client_id, data, setUrlState, setSelectedDialog]);

  return (
    <div
      className={`${styles.item} ${isSelected ? styles.active : ""}`}
      onClick={onSelectDialog}
      role="button"
    >
      <div className={styles.avatar}>{client_name.charAt(0).toUpperCase()}</div>

      <div className={styles.content}>
        <div className={styles.header}>
          <span className={styles.name}>{client_name}</span>
          <span className={styles.time}>
            {dayjs(last_message_).format("HH:mm")}
            {last_message.is_read ? (
              <BsCheck2All size="16px" />
            ) : (
              <BsCheck2 size="16px" />
            )}
          </span>
        </div>
        <span className={styles.preview}>{last_message.text}</span>
        {unread_count > 0 && <Badge>{unread_count}</Badge>}
      </div>
    </div>
  );
}

export default ChatSidebarItem;

ChatPanel

// src/crud/chat/components/ChatPanel.tsx
import React from "react";

import { Card } from "@devfamily/admiral";
import { useChat } from "../contexts/ChatContext";
import MessageFeed from "./MessageFeed";
import MessageInput from "./MessageInput";
import styles from "./ChatPanel.module.scss";

function ChatPanel() {
  const { selectedDialog } = useChat();

  if (!selectedDialog) {
    return (
      <Card className={styles.emptyPanel}>
        <div className={styles.emptyState}>
          <h3>Select a dialog</h3>
          <p>Select a dialog from the list to start a conversation</p>
        </div>
      </Card>
    );
  }

  return (
    <div className={styles.panel}>
      <MessageFeed />
      <div className={styles.divider} />
      <MessageInput />
    </div>
  );
}

export default ChatPanel;

MessageFeed

// src/crud/chat/components/MessageFeed.tsx
import React, { useRef, useEffect } from "react";

import dayjs from "dayjs";
import { BsCheck2, BsCheck2All } from "react-icons/bs";
import { useChat } from "../contexts/ChatContext";
import styles from "./MessageFeed.module.scss";

function MessageFeed() {
  const { messages } = useChat();
  const scrollRef = useRef(null);

  useEffect(() => {
    // Keep the feed scrolled to the latest message
    scrollRef.current?.scrollTo({ top: scrollRef.current.scrollHeight });
  }, [messages]);

  return (
    <div ref={scrollRef} className={styles.feed}>
      {messages.map((group) => (
        <div key={group.date} className={styles.dateGroup}>
          <div className={styles.dateDivider}>
            <span>{group.date}</span>
          </div>
          {group.messages.map((msg) => (
            <div key={msg.id} className={styles.message}>
              {msg.text && <p>{msg.text}</p>}
              {msg.image && (
                <img
                  src={msg.image}
                  alt=""
                  style={{ maxWidth: "200px", borderRadius: 4 }}
                />
              )}
              {msg.file && (
                <a href={msg.file} target="_blank" rel="noopener noreferrer">
                  Download file
                </a>
              )}
              <div style={{ fontSize: "0.8rem", opacity: 0.6 }}>
                {dayjs(msg.created_at).format("HH:mm")}
                {msg.is_read ? <BsCheck2All /> : <BsCheck2 />}
              </div>
            </div>
          ))}
        </div>
      ))}
    </div>
  );
}

export default MessageFeed;

MessageInput

// src/crud/chat/components/MessageInput.tsx
import React, {
  ChangeEventHandler,
  useCallback,
  useEffect,
  useRef,
  useState,
} from "react";

import { FiPaperclip } from "react-icons/fi";
import { RxPaperPlane } from "react-icons/rx";
import { Button, useUrlState, Textarea } from "@devfamily/admiral";

import { useChat } from "../contexts/ChatContext";

import styles from "./MessageInput.module.scss";

function MessageInput() {
  const { sendMessage } = useChat();
  const [urlState] = useUrlState();
  const { client_id } = urlState;
  const [values, setValues] = useState<any>({});
  // Validation errors returned by the API (can be rendered under the input)
  const [errors, setErrors] = useState<any>(null);
  const textRef = useRef<HTMLTextAreaElement>(null);

  useEffect(() => {
    setValues({});
    setErrors(null);
  }, [client_id]);

  const onSubmit = useCallback(
    async (e?: React.FormEvent<HTMLFormElement>) => {
      e?.preventDefault();
      const textIsEmpty = !values.text?.trim()?.length;

      sendMessage(
        {
          ...(values.image && { image: values.image }),
          ...(!textIsEmpty && { text: values.text }),
          client_id,
        },
        () => {
          setValues({ text: "" });
        },
        (err: any) => {
          if (err.errors) {
            setErrors(err.errors);
          }
        }
      );
    },
    [values, sendMessage, client_id]
  );

  const onUploadFile: ChangeEventHandler<HTMLInputElement> = useCallback((e) => {
    const file = Array.from(e.target.files || [])[0];
    setValues((prev: any) => ({ ...prev, image: file }));
    e.target.value = "";
  }, []);

  const onChange = useCallback((e) => {
    setValues((prev: any) => ({ ...prev, text: e.target.value }));
  }, []);

  const onKeyDown = useCallback(
    (e: React.KeyboardEvent<HTMLTextAreaElement>) => {
      if ((e.code === "Enter" || e.code === "NumpadEnter") && !e.shiftKey) {
        onSubmit();
        e.preventDefault();
      }
    },
    [onSubmit]
  );

  return (
    <form className={styles.form} onSubmit={onSubmit}>
      <label className={styles.upload}>
        <input
          type="file"
          onChange={onUploadFile}
          className={styles.visuallyHidden}
        />
        <FiPaperclip size="24px" />
      </label>
      <Textarea
        value={values.text ?? ""}
        onChange={onChange}
        rows={1}
        onKeyDown={onKeyDown}
        placeholder="Write a message..."
        ref={textRef}
        className={styles.textarea}
      />
      <Button
        view="secondary"
        type="submit"
        disabled={!values.image && !values.text?.trim().length}
        className={styles.submitBtn}
      >
        <RxPaperPlane />
      </Button>
    </form>
  );
}

export default MessageInput;

Styling

I styled it using Admiral’s CSS variables to keep everything consistent:

.chat {
  border-radius: var(--radius-m);
  border: 2px solid var(--color-bg-border);
  background-color: var(--color-bg-default);
}

.message {
  padding: var(--space-m);
  border-radius: var(--radius-s);
  background-color: var(--color-bg-default);
}

Adding Notifications

I also added notifications for new messages when the user wasn’t viewing that chat:

// inside ChatProvider (src/crud/chat/contexts/ChatContext.tsx)
import { useNotifications } from '@devfamily/admiral'

const { showNotification } = useNotifications()

useEffect(() => {
  if (!lastMessage) return

  // Notify only when the message belongs to a dialog that isn't currently open
  if (selectedDialog?.client_id !== lastMessage.client_id) {
    showNotification({
      title: 'New message',
      message: `${lastMessage.client_name}: ${lastMessage.text || 'Image'}`,
      type: 'info',
      duration: 5000,
    })
  }
}, [lastMessage, selectedDialog, showNotification])
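Along the same lines, when the incoming message does belong to the open dialog, it makes sense to call api.read() right away. The payload below is an assumption about the backend contract, so treat it as a sketch:

// Hypothetical extension of the same effect; the read() payload shape is assumed
if (selectedDialog?.client_id === lastMessage.client_id) {
  // The dialog is on screen, so report the new message as seen
  api.read({ client_id: selectedDialog.client_id })
}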

Conclusion

And just like that, instead of relying on third-party tools, I built the chat directly into my Admiral-based admin panel. Admiral’s routing, contexts, hooks, and design system made it possible to build a real-time chat that felt native to the panel.

The result was a fully custom chat: real-time messaging, dialogs, file uploads, and notifications—all integrated and under my control.

Check it out, and let me know what you think!
