Mika learned early that brokering was a strange kind of acting: part empath, part strategist, all stage manager. She could stand in front of a buyer and feel the small, private panic of someone who’d never negotiated for a house before. She could sit with a seller and hear—under the polite, rehearsed sentences—their fear of getting less than what the property “deserved.” Seeing the world from both sides wasn’t theatrical trickery; it was the job. But the truth her mentor had whispered on her first day in the office stuck with her: “Know the others’ hearts. Hide your own.”
The summer after the MLS rules changed, the lines of her work grew sharper. Listings no longer showed offers of cooperative compensation on the public feed; a legal settlement and the policy changes that followed it had reshaped the marketplace, and the old, transparent signal about how buyer agents would be paid had been taken offstage. That meant buyer agents, sellers, and listing agents all had to negotiate on a slightly darker set of assumptions—compensation, expectations, and incentives were now topics you explored in private instead of reading on a screen. Mika’s ability to imagine what each party truly wanted—security, speed, a clean closing—became more valuable than ever.
At the same time, the regulators were watching. The larger financial ecosystem—broker-dealers, advisory platforms, even fintech tools—had been under increasing pressure to demonstrate that clients’ interests were not being sacrificed for hidden fees or undisclosed incentives. The standard for recommendations and the conduct of brokers had become noticeably stricter: exams and enforcement cycles were focusing on fiduciary-like standards, conflicts of interest, and the use of new technologies. Mika kept a folder on her desk with plain-language summaries of those priorities; it was a reminder that “hide your own interests” didn’t mean “lie” or “omit.” It meant managing conflicts transparently where law required, and ethically where it didn’t.
That fall, a tricky deal arrived on her calendar. A software startup founder—Kai—wanted to buy a narrow, sunlit townhouse to anchor his frequent trips into the city. The seller was an elderly artist, Mrs. Ogawa, who wanted one thing above all: a fast, painless closing so she could move to a care facility near her son. Kai had a high tolerance for a small renovation and an unsentimental approach to negotiation. Mrs. Ogawa wanted certainty and kindness. On paper the numbers matched, but the human cost of delay could be enormous for her.
Mika opened the file and did the quiet work brokers do: she mapped interests instead of positions. She listed Kai’s BATNA (walk away and rent nearby), Mrs. Ogawa’s time horizon, the mortgage timeline, and the neighborhood comps. She then ran the property through the AI valuation engine the firm had licensed—an algorithm that scraped listing histories, permit data, and recent comps to suggest a range. The tool gave her speed and a probability distribution, but Mika treated its output as one input among many; regulators had made clear that algorithmic outputs didn’t absolve human judgment and that conflicts or model limitations needed to be considered. She annotated the AI’s assumptions in the file—what data it had missed, what renovations were invisible to its training set—and prepared to use both numbers and narrative.
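The weighting Mika applied is easy to picture as a small calculation. The sketch below is a minimal, hypothetical Python illustration of how a model's suggested range might be blended with hand-adjusted comps so that the algorithm stays one input among many; the figures and helper names (Comp, blended_range) are assumptions made for illustration, not the firm's actual tool.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Comp:
    """A recent comparable sale, as a broker might log it by hand."""
    address: str
    sale_price: float
    adjustment: float = 0.0  # broker's +/- tweak for condition, light, unpermitted work

def blended_range(model_low: float, model_high: float,
                  comps: list[Comp], model_weight: float = 0.5) -> tuple[float, float]:
    """Blend an AI model's suggested range with adjusted human comps.

    The model output is treated as one input among many: it gets a capped
    weight, and the broker's adjusted comps supply the rest.
    """
    adjusted = [c.sale_price + c.adjustment for c in comps]
    comp_anchor = mean(adjusted)
    model_mid = (model_low + model_high) / 2
    # Weighted midpoint, then reuse the model's spread as the uncertainty band.
    mid = model_weight * model_mid + (1 - model_weight) * comp_anchor
    half_spread = (model_high - model_low) / 2
    return (mid - half_spread, mid + half_spread)

if __name__ == "__main__":
    comps = [
        Comp("12 Alder Ln", 815_000, adjustment=+10_000),  # renovation invisible to the model
        Comp("48 Birch St", 790_000, adjustment=-5_000),   # busier street
        Comp("7 Cedar Ct", 802_000),
    ]
    low, high = blended_range(model_low=780_000, model_high=820_000, comps=comps)
    print(f"Working range: ${low:,.0f} to ${high:,.0f}")
```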
In negotiation she favored a version of principled bargaining she’d studied in law school and refined on the job: separate the people from the problem, dig into interests rather than positions, invent options before committing, and rely on objective criteria when possible. Those were the four pillars from the classic negotiation playbook she carried in her head. With Kai she framed options—shorter escrow with a slightly reduced price, or a full-price offer contingent on a 21-day closing with an extra earnest-money deposit—so both parties could see the trade-offs. With Mrs. Ogawa she used empathy first: a scheduled moving coordinator, an early possession clause if needed, and an explicit timetable so her son could make caregiving plans. The options addressed real interests rather than haggling over a single number.
Mika’s internal balancing act—protecting her commission while serving the deal—was less about secrecy and more about role definition. She disclosed to both sides what she was required to disclose by law: compensation arrangements where legally required, material facts about the property, and any conflicts flagged by firm policy. But she kept strategic priorities—her negotiation levers, the order in which she’d propose options, and the pace she’d push the counterparties—close to the vest. In practical terms, that meant she never let one side see her whole playbook, but she did let them see the parts that mattered for their decision: timelines, material facts, and objective comparables.
When the counteroffers crossed, Mika used the AI valuation not as an oracle but as a neutral arbiter: she printed the model’s range, highlighted the comparable properties it used, and paired that with a local sales comp sheet and a list of repairs the inspector had noted. That combination—algorithmic range plus human-grounded comps—built a credible anchor that both parties accepted as “objective.” The buyer relaxed because the numbers supported his offer. The seller relaxed because the timetable protected her need for speed. They both felt heard. The deal closed in twenty-one days.
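How an algorithmic range and an inspector's repair list fold into one shared anchor is, again, simple arithmetic. A minimal sketch, assuming hypothetical repair figures and helper names (anchor_range, offer_within) that do not come from the story:

```python
def anchor_range(model_low: float, model_high: float,
                 repair_estimates: dict[str, float]) -> tuple[float, float]:
    """Shift the model's range down by the inspector's repair estimates,
    producing the 'net of repairs' band both sides can point to."""
    total_repairs = sum(repair_estimates.values())
    return (model_low - total_repairs, model_high - total_repairs)

def offer_within(offer: float, band: tuple[float, float]) -> bool:
    """True if an offer falls inside the shared objective band."""
    low, high = band
    return low <= offer <= high

if __name__ == "__main__":
    repairs = {"roof flashing": 4_500, "furnace service": 1_200}  # illustrative inspector items
    band = anchor_range(790_000, 825_000, repairs)
    print("Anchor band:", band)
    print("Offer of 800,000 inside band?", offer_within(800_000, band))
```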
After the closing, Kai sent a brief, surprised note: “You made this painless. Beat a lot of other closers I’ve used.” Mrs. Ogawa’s son called to thank Mika for the careful timetable that had spared his mother a frantic move. Mika placed the checks in the drawer and opened the compliance checklist—not a theatrical end but the necessary accounting of disclosures, consent forms, and record-keeping. In a world where regulators demanded both transparency and accountability, a broker who could see both sides and keep a clear paper trail was more valuable than ever.
That night she sat on her balcony and thought about the paradox again: brokering asked you to be two people at once—a translator of needs for strangers and a guardian of your own livelihood. The modern landscape had added new constraints—legal eyes, invisible algorithms, and a marketplace that sometimes concealed the levers it used. So Mika decided on a rule she would keep always: be ruthlessly curious about others’ true interests; be rigorously honest about material facts and required disclosures; and be quietly strategic about the mechanics of getting to a yes. In the end, seeing the other person’s view was not manipulation. It was the only reliable way to build an agreement that actually worked for everyone involved.
All names of people and organizations appearing in this story are pseudonyms.
