Original Content podcast: Netflix’s Taylor Swift documentary feels like a guarded self-portrait

“Miss Americana,” a new Netflix documentary about Taylor Swift, is worth watching — if you go in with the right expectations.

At least, that’s according to two out of three hosts of the Original Content podcast. Darrell was the holdout; he didn’t hate the movie or think it was poorly made, but he’s much more skeptical about celebrity culture in general and argues that everyone would be better off ignoring celebrities altogether.

Your other hosts don’t go quite that far. Instead, we admit to a guarded admiration for Swift and her music, and we enjoyed “Miss Americana” as a window into Swift’s world. Not a completely transparent window — despite being directed by Lana Wilson, the film feels like it was guided by Swift’s perspective, focusing on her chosen themes of tabloid persecution and political awakening — but a revealing one nevertheless.

What comes across clearly is the utter insanity of the musician’s life, lived under intense (and often unfair) media scrutiny.

The film also demonstrates the extraordinary talent, ambition and luck that Swift must have needed to get where she is. And it boasts a few glimpses into her songwriting and recording process, and into what appears to have been an agonizing decision to endorse Democrat Phil Bredesen’s ultimately unsuccessful run for one of Tennessee’s Senate seats in 2018.

In addition to reviewing the film, we also discuss Netflix’s decision to make auto-play previews optional.

You can listen in the player below, subscribe using Apple Podcasts or find us in your podcast player of choice. If you like the show, please let us know by leaving a review on Apple. You can also send us feedback directly. (Or suggest shows and movies for us to review!)

And if you’d like to skip ahead, here’s how the episode breaks down:

0:00 Intro
0:28 Netflix auto-play discussion
5:02 “Miss Americana” review

More cash-crunched companies turn to convertible notes

Convertible notes are not just for early stage startups any more.

These promissory notes, which are structured as debt that converts into equity upon a specific event, like a certain date or the closing of a priced investment round, are increasingly being adopted by established companies that have already raised millions of dollars in venture capital.

In the past, these financial instruments have been the province of founders that weren’t sure how to value their companies. If they agreed to sell a fixed percentage of their startup when they didn’t have a lot of customer traction, they might be giving up a lot of upside in their company. Convertible notes allow companies more time to develop their businesses before deciding who gets what.

In recent months, however, more established companies that have already raised priced rounds have raised money via convertible notes. According to the WSJ, Juul Labs, the e-cigarette maker, recently raised more than $700 million in convertible debt to fund its operations. NeueHouse, a venture-backed, nine-year-old New York-based company that provides workspace to creatives, is in the process of raising $15 million in convertible debt, an SEC filing shows. And a crypto exchange that has raised several rounds of venture funding, LedgerX, just closed on $3.8 million in debt.

Why would these older companies want to steal a page from the early-stage startup handbook? Because they want to avoid the negative blowback that might result from a “down round,” or a round that establishes a valuation lower than the previous valuation.

VCs don’t like down rounds, as they mean “writing down” the value of their holdings in the financial statements they provide to their own investors. Down rounds can also publicly signal that a startup’s growth is slowing, or hammer home the fact that investors overestimated how much their original stake was worth.

Convertible notes are a convenient band-aid, giving companies a little breathing room to fix their products, search for possible buyers, or move into a different space if what’s plaguing them applies more to their industry than their specific products or services.

From all outward appearances, the e-cigarette maker Juul definitely needs some time to get its ducks in a row. The company is under growing regulatory and financial pressure from investigations into its marketing practices and mounting lawsuits from school districts and counties (among them, the Ceres Unified School District, outside of Modesto; the Monterey Peninsula Unified School District; and Bucks County, Pa.).

While Juul once seemed like such a sure bet that Altria purchased a 35% stake in the company for $12.8 billion, Juul’s future is now a lot less clear. On January 30th, Altria marked down the value of its stake by $4.1 billion, and it now values the entire company at just $12 billion, a 68% decline from the original valuation that Altria ascribed to Juul when it invested.

According to the WSJ, the $700 million will only convert to equity if Juul’s next round values the company between $10 billion and $25 billion. If the valuation of the next round is lower than $10 billion or higher than $25 billion, the $700 million will be treated as debt.

The lower threshold protects investors from holding mere shares in a company whose value is plummeting; in that case, it would be much better to have a claim on solid company assets (think the receivables, computers and real estate that secure the debt). Conversely, the higher cap gives investors the assurance that their equity will not be unduly diluted by an unrealistically high valuation from a new investor.

If all goes well, investors benefit because their money converts into equity at a discount to the price paid by the next investor. It’s more exciting than receiving just a straight percentage on their money, which they would earn with traditional debt.
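For readers who want to see the mechanics laid out, here’s a minimal back-of-the-envelope sketch in Python. The discount rate and the ownership math are illustrative assumptions on my part — the actual terms of Juul’s notes aren’t public beyond the reported $10 billion and $25 billion thresholds:

```python
def note_outcome(principal, next_round_valuation, floor, cap, discount=0.20):
    """Rough sketch of how a convertible note with a valuation window might behave.

    The 20% discount and the ownership formula are illustrative assumptions,
    not the actual terms of any deal described in the article.
    """
    if next_round_valuation < floor or next_round_valuation > cap:
        # Outside the window the note stays debt, secured by company assets.
        return {"converts": False, "treated_as": "debt", "principal": principal}

    # Inside the window, the note converts at a discount to the new round's price,
    # so note holders get more equity per dollar than the new investors.
    effective_valuation = next_round_valuation * (1 - discount)
    ownership = principal / (effective_valuation + principal)  # rough post-money approximation
    return {"converts": True, "treated_as": "equity", "ownership_pct": round(ownership * 100, 2)}


# Illustrative runs with the reported $700M note and $10B-$25B window.
print(note_outcome(0.7e9, 15e9, floor=10e9, cap=25e9))  # converts to equity
print(note_outcome(0.7e9, 8e9, floor=10e9, cap=25e9))   # stays debt
```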

And in some sense, the use of convertible notes by later-stage companies can be viewed as a positive sign. After all, if these companies were irretrievably broken, they wouldn’t be able to raise any money at all.

Still, it’s a desperate look for Juul, in particular, which in 2018 was considered to be among the most highly valued private companies in the world.

It suggests that the company’s future is so unpredictable that it can no longer satisfy the requirements of traditional lenders, which insist that borrowers meet a broad array of financial milestones, like hitting revenue targets and maintaining minimum levels of cash on hand.

The team behind Apple’s ‘Mythic Quest’ says video games aren’t the punch line

When Ubisoft first approached “It’s Always Sunny in Philadelphia” stars Rob McElhenney and Charlie Day about creating a new show set in the video game industry, McElhenney said they weren’t interested — at least, not initially.

“Anything that we had ever seen in the past, from a movie or television show perspective, the industry was always presented in such a negative light,” he told me. “It was the butt of the joke. The characters themselves were derided, and it was very specific to geek culture … We just had no interest in that.”

And yet McElhenney, Day and “It’s Always Sunny” writer Megan Ganz ended up creating “Mythic Quest: Raven’s Banquet,” which premieres on Apple TV+ this weekend. McElhenney explained that a visit to the Montreal offices of Ubisoft — publisher of “Assassin’s Creed,” “Prince of Persia” and other major game franchises — changed his mind.

“Once we went to Montreal and met all of the devs that worked at Ubisoft, that all work in communion to make these games, [we realized] how many different, disparate personalities there really were and how much they were all united by their love of games,” he said.

So McElhenney decided that “this just seemed like a really interesting and new place to set those kinds of stories.” And just as he assumes most “Sunny” viewers aren’t tuning in to learn the fate of Paddy’s Pub (the Philadelphia bar run by the show’s main characters), “The approach we took was, the general audience is not going to care about the success or failure of a video game, they’re going to care about the interpersonal dynamics of the characters themselves.”

Ganz also said she didn’t know much about video game development when McElhenney first approached her about collaborating on the show, but she started to see parallels between that world and a TV writers’ room.

“Except that instead of everyone being a writer, they all have very specialized jobs that they care about, like just the writing or just the design or just the money that’s being made,” she said. “And I thought, well, that’s really fun because that presents something that’s even more complex than your typical writers’ room — you have all these sort of Greek gods that all control their very specific part of the world.”


Of course, “Mythic Quest” had a writers’ room of its own, which Ganz said was divided evenly between people with deep knowledge of the industry (like Ashly Burch, who’s done extensive voiceover work on games like “Team Fortress 2” and “Fortnite,” and who also plays a game tester on the show), and those like Ganz herself, “who maybe played casually when they were younger” but ultimately didn’t know much about that world.

“We did that because ultimately, if you come up with a script or a joke that satisfies both of those people, then you’re going to satisfy as much of the audience as you possibly can,” she said.

The goal, she added, was not “pandering to the video game community,” but rather “to be authentic and not make fun of them, but also be authentic in terms of talking about some of the toxicity that happens in the video game space, the gender dynamics that are at play.”

It wasn’t just a learning process for the writers. F. Murray Abraham (who won an Oscar for playing Salieri in “Amadeus”) plays an eccentric science fiction writer who works on the game, and he told me that when it came to video games, “I had no idea. I knew something, I was aware of it, but not the size of it, the success of it, the reach of it, my God.”

All the “Mythic Quest” writers and actors I spoke to said that their approach has evolved significantly from the original pilot script. For example, there’s McElhenney’s character Ian Grimm, the creative director of the massively multiplayer online roleplaying game that gives the show its name.

“In the first draft of the script, we made Ian a little bit more of just a straight buffoon,” McElhenney said. “We read through it and we realized it just felt false. It was missing something, that if we didn’t want this to feel like a live action cartoon — like ‘Sunny’ often does, which is by design — and we wanted these people to feel real and authentic, that we needed to believe that he really should have that position.”

The question, then, was how to make him competent, but in a funny way. They went with a pilot episode where Ian and lead engineer Poppy (played by Charlotte Nicdao) end up in a passionate debate about the properties of the game’s brand new shovel. While that debate will probably seem silly to most viewers, McElhenney said it also conveys “that thing that so many people in the creative arts have, or don’t have — the ability to see the most minor detail, the reason why something is going to work, or why it might not work.”


Throughout that process, the writers also tapped Ubisoft for advice. Jason Altman, Ubisoft’s head of film and television, is an executive producer on the show, and he recalled bringing in different team members to help the writers understand everything that goes into the development process.

In addition, Ubisoft Red Storm (the studio behind the Tom Clancy game franchise) pitched in by building the game segments that we actually see on the show.

“What they created were actually small gameplay sandboxes that we could bring to set, and the actors could sit and play with them and it would actually inform their performances,” Altman said.

He acknowledged that there were challenges, like helping the “Mythic Quest” writers realize that the developers needed time to do their work — but ultimately, he said the Red Storm team had “a great time” creating something that gave the show “a real sense of authenticity.”

Ganz and McElhenney also had plenty of praise for the developers, particularly for their openness to adding silly comedic elements like ridiculous gouts of blood. McElhenney pointed to one episode that required them to create “a really believable Sieg Heil Nazi salute.”

“There’s no way they’re going to go for that, it’s going to take a follow-up phone call,” he recalled thinking. “And they were like, ‘Okay great.’ And I was like, ‘Wait, what do you mean, okay great?’ They said, ‘No, we do Nazis all the time’ — and we put this in the show — ‘because Nazis make the best villains, everybody hates Nazis.’”

I was also curious about why the show focuses on the development of an ongoing MMORPG, rather than launching a new game. Altman had an answer for me: “I think it represents what’s happening within the game industry. You don’t just launch a game and forget it, the development team lives with it, you’ve got live services and live events. It’s the way games are operated right now.”

Plus, he said it reflects another aspect of development, the fact that teams “don’t just spend six months together, they spend years together, and the success that they create together binds them together.”

David Hornsby — who, like McElhenney, is a writer, executive producer and actor on the show — told me that the writers’ understanding of the show’s distribution also evolved, since Apple TV+ hadn’t launched (or even been officially announced) when “Mythic Quest” first got picked up.

“We weren’t sure if it wasn’t going to be binge-able from the start, we heard incrementally,” Hornsby said. “Apple is good at keeping secrets.”

Ultimately, they did find out that all nine episodes would drop at once, which Hornsby said led them to structure the season “like a movie — we know where we are going to be in the middle of the season, the story arcs for each of our characters.”

I also brought up Apple TV+ with McElhenney, who said the team had offers from a number of studios.

“It was scary,” he said. “And I remember we were discussing it, we were like, do we go with a known quantity? Or do we jump into the waters of mystery, because even though it’s the biggest company in the world, you don’t know if it’s going to work.”

So why choose Apple? “We just felt like, if you’re gonna bet on somebody, why not bet on a trillion dollars? They seem to have the resources and something figured out.”

As top exhibitors pull out of MWC, organizers implement stringent safeguards

With a couple of weeks to go, Mobile World Congress organizer the GSMA has issued some fairly sweeping safeguards amid growing concerns around the coronavirus. After a number of high-profile pullouts, including ZTE, LG, NVIDIA and Ericsson, the organization issued a new list of measures, including a ban on visitors from Hubei province, whose capital Wuhan is believed to be the origin of the epidemic.

Per GSMA CEO John Hoffman,

  • All travelers from the Hubei province will not be permitted access to the event

  • All travelers who have been in China will need to demonstrate proof they have been outside of China 14 days prior to the event (passport stamp, health certificate)

  • Temperature screening will be implemented

  • Attendees will need to self-certify they have not been in contact with anyone infected.

More than 800 people have died from the virus, surpassing the 774 people who were killed by SARS in 2002-2003. Hoffman adds that the organizer will be stepping up its disinfection program around the site and promoting a “no handshake policy.” As the organization notes, some 5,000-6,000 people from China attend the show each year, accounting for around 5-6 percent of visitors.

The GSMA is clearly interested in addressing concerns over the virus, while limiting further attendee or exhibitor erosion. The release quotes Catalan health minister Alba Vergés, who notes, “The Catalan health system is prepared to detect and treat coronavirus, to give the most appropriate response, and this must be clear to those attending MWC Barcelona.”

Index Ventures’ portfolio is driving long-overdue innovation in femcare

U.K. startup Daye is rethinking female intimate care from a woman’s perspective, starting with a tampon infused with cannabidiol that tackles period pain.

It’s also quietly demolishing the retrograde approach to product design that women are still subjected to in the mass market “femcare” space — an anti-philosophy that not only peddles stale and sexist stereotypes, but also can harm women’s bodies.

Those perfumed sanitary pads stinking out the supermarket shelf? Whoever came up with that idea has obviously never experienced thrush or bacterial vaginosis. Nor spoken to a health professional who could have told them vaginal infections can be triggered by perfumed products.

The missing link: There are few people with a vagina in positions leading product strategy. And that’s the disruptive opportunity female-led femcare businesses like Daye are closing in on.

The Index Ventures-backed startup is shaking up a tired category by selling the flip-side: thoughtfully designed products for period care that first do no harm and second take aim at actual problems women have — starting with dysmenorrhea. The overarching strand is building community — to help women better understand what’s going on with their bodies and reinforce shifting product expectations in the process.

We chatted with Index principal Hannah Seal about the fund’s investment in Daye, and to get her thoughts more broadly on a new generation of female-focused startups that are driving long-overdue innovation.

The interview has been edited for length and clarity.

After $479M round on $12.4B valuation, Snowflake CEO says IPO is next step

Snowflake, the cloud-based data warehouse company, doesn’t tend to do small rounds. On Friday night, word leaked out about its latest mega round. This one was for $479 million on a $12.4 billion valuation. That’s more than triple the company’s previous $3.9 billion valuation from October 2018, and CEO Frank Slootman suggested that the company’s next financing event is likely an IPO.

Dragoneer Investment Group led the round along with new investor Salesforce Ventures. Existing Snowflake investors Altimeter Capital, ICONIQ Capital, Madrona Venture Group, Redpoint Ventures, Sequoia, and Sutter Hill Ventures also participated. The new round brings the total raised to over $1.4 billion, according to PitchBook data.

All of this investment raises the question of when this company will go public. As you might expect, Slootman is keeping his cards close to the vest, but he acknowledges that an IPO is the next logical step for his organization, even if he is not feeling pressure to make that move right now.

“I think the earliest that we could actually pull that trigger is probably early- to mid-summer timeframe. But whether we do that or not is a totally different question because we’re not in a hurry, and we’re not getting pressure from investors,” he said.

He grants that the pressure is about allowing employees to get their equity out of the company, which can only happen once the company goes public. “The only reason that there’s always a sense of pressure around this is because it’s important for employees, and I’m not minimizing that at all. That’s a legitimate thing. So, you know, it’s certainly a possibility in 2020 but it’s also a possibility the year thereafter. I don’t see it happening any later than that,” he said.

The company’s most recent round prior to this was $450 million in October 2018. Slootman says that he absolutely didn’t need the money, but the capital was there, and the chance to forge a relationship with Salesforce was also key to the company’s thinking in taking this funding.

“At a high level, the relationship is really about allowing Salesforce data to be easily accessed inside Snowflake. Not that it’s impossible to do that today because there are lots of tools that will help you do that, but this relationship is about making that seamless and frictionless, which we find is really important,” Slootman said.

Snowflake now has relationships with AWS, Microsoft Azure and Google Cloud Platform, and has a broad content strategy aimed at getting as much quality data (like Salesforce’s) onto the platform as possible. Slootman says that this helps induce a network effect, while helping move data easily between major cloud platforms, a big concern as more companies adopt a multi-cloud strategy.

“One of the key distinguishing architectural aspects of Snowflake is that once you’re on our platform, it’s extremely easy to exchange data with other Snowflake users. That’s one of the key architectural underpinnings. So content strategy induces network effect which in turn causes more people, more data to land on the platform, and that serves our business model,” he said.

Slootman says investors want to be part of his company because it’s solving some real data interchange pain points in the cloud market, and the company’s growth shows that, in spite of its size, it continues to attract new customers at a high rate.

“We just closed off our previous fiscal year, which ended last Friday, and our revenue grew at 174%. For the scale that we are, this is by far the fastest growing company out there… So, that’s not your average asset,” he said.

The company has 3,400 active customers, which he defines as customers who were actively using the platform in the last month. He says they added 500 new customers in the last quarter alone.

Watch two rocket launches live, including a Space Station supply flight and a mission to study the Sun

There are two – that’s right, two – launches happening this Sunday, and both are set to broadcast live on NASA’s official stream above. The first is a NASA International Space Station resupply mission, with a Northrop Grumman Cygnus spacecraft launching aboard an Antares rocket from Wallops Island in Virginia at 5:39 PM EST (2:39 PM PST). The second is the launch of the Solar Orbiter spacecraft, a joint scientific mission by NASA and the European Space Agency (ESA) that’s set to take off aboard a United Launch Alliance (ULA) Atlas V rocket from Cape Canaveral, Florida at 11:03 PM EST (8:03 PM PST).

The ISS resupply mission is the 13th operated by Northrop Grumman, and will carry around 8,000 lbs of experiment materials, supplies for the Station’s astronaut crew and additional cargo. If all goes to plan, the Cygnus spacecraft will get to the Space Station on Tuesday at around 4:30 AM EST, where astronauts on board will capture the spacecraft with the station’s robotic arm for docking.

The NASA/ESA Solar Orbiter mission is a bit more of an event, since it’s a launch of a very special payload with a dedicated mission to study the Sun, launching aboard a brand new custom configuration of ULA’s Atlas V rocket tailor-made for the Orbiter. The Orbiter has a mass of nearly 4,000 lbs, and a wingspan of nearly 60 feet, and is carrying a complement of 10 instruments for gathering data from our Solar System’s central player.

Solar Orbiter will take the first-ever direct images of the Sun’s poles once it arrives at our star, but it first has to get there, using the gravitational force of both Earth and Venus to help propel it along its path. Already, the planned launch of Solar Orbiter has been delayed by a few days – and timing is key to making sure those gravitational forces can work as designed to get it to its goal, so here’s hoping today’s launch goes off as planned.

As its name implies, Solar Orbiter is designed to orbit the Sun – and it’ll do so from a relatively close distance of around 26 million miles away. That’s closer than Mercury, the planet in our solar system closest to the Sun, and at that distance it’ll still face max temperatures of around 520 degrees Celsius (968 degrees Fahrenheit). To endure those temps, the spacecraft is protected by a titanium heat shield that will always be oriented towards the star, and even its solar panels will actually have to tilt away from the Sun during the spacecraft’s closest approach to make sure they don’t get too hot while powering the satellite.

Solar Orbiter will study the Sun’s polar regions, as mentioned, and shed some light on how its magnetic field and emissions of particles from the star affect its surrounding cosmic environment, including the region of space that we inhabit here on Earth. After launch, Orbiter should make its way to Venus for a flyby this December, then cross paths with Earth for a planned approach in November 2021, before making its first close approach to the Sun in 2022.

Check back above for live views of both launches, with the stream for the first mission kicking off shortly after 5 PM EST (2PM PST).

The war against space hackers: how the JPL works to secure its missions from nation-state adversaries

NASA’s Jet Propulsion Laboratory designs, builds, and operates billion-dollar spacecraft. That makes it a target. What the infosec world calls Advanced Persistent Threats — meaning, generally, nation-state adversaries — hover outside its online borders, constantly seeking access to its “ground data systems,” its networks on Earth, which in turn connect to the ground relay stations through which those spacecraft are operated.

Their presumptive goal is to exfiltrate secret data and proprietary technology, but the risk of sabotage of a billion-dollar mission also exists. Over the last few years, in the wake of multiple security breaches which included APTs infiltrating their systems for months on end, the JPL has begun to invest heavily in cybersecurity.

I talked to Arun Viswanathan, a key NASA cybersecurity researcher, about that work, which is a fascinating mix of “totally representative of infosec today” and “unique to the JPL’s highly unusual concerns.” The key message is firmly in the former category, though: information security has to be proactive, not reactive.

Each mission at JPL is like its own semi-independent startup, but their technical constraints tend to be very unlike those of Valley startups. For instance, mission software is usually homegrown/innovative, because their software requirements are so much more stringent: for instance, you absolutely cannot have software going rogue and consuming 100% of CPU on a space probe.

Successful missions can last a very long time, so the JPL has many archaic systems, multiple decades old, which are no longer supported by anyone; they have to architect their security solutions around the limitations of that ancient software. Unlike most enterprises, they are open to the public, who tour the facilities by the hundred. Furthermore, they have many partners, such as other space agencies, with privileged access to their systems.

All that … while being very much the target of nation-state attackers. Theirs is, to say the least, an interesting threat model.

Viswanathan has focused largely on two key projects. One is the creation of a model of JPL’s ground data systems — all its heterogeneous networks, hosts, processes, applications, file servers, firewalls, etc. — and a reasoning engine on top of it. This then can be queried programmatically. (Interesting technical side note: the query language is Datalog, a non-Turing-complete offshoot of venerable Prolog which has had a resurgence of late.)

Previous to this model, no one person could confidently answer “what are the security risks of this ground data system?” As with many decades-old institutions, that knowledge was largely trapped in documents and brains.

With the model, ad hoc queries such as “could someone in the JPL cafeteria access mission-critical servers?” can be asked, and the reasoning engine will search out pathways, and itemize their services and configurations. Similarly, researchers can work backwards from attackers’ goals to construct “attack trees,” paths which attackers could use to conceivably reach their goal, and map those against the model, to identify mitigations to apply.
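Viswanathan’s actual reasoning engine is built on Datalog, but the core idea — model the ground data system as a graph, then ask reachability questions against it — can be sketched in a few lines of Python. The hosts, edges and query below are invented for illustration and aren’t drawn from JPL’s real model:

```python
from collections import deque

# A toy model of a ground data system: which segments can reach which.
# These nodes and edges are made up; the real JPL model covers hosts,
# processes, applications, file servers, firewall rules and more.
reachable_from = {
    "cafeteria_wifi": {"guest_gateway"},
    "guest_gateway": {"corporate_lan"},          # e.g. a misconfigured firewall rule
    "corporate_lan": {"mission_ops_jumphost"},
    "mission_ops_jumphost": {"mission_critical_server"},
    "mission_critical_server": set(),
}

def find_path(src, dst):
    """Breadth-first search: is there any pathway from src to dst, and through what?"""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in reachable_from.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# "Could someone in the cafeteria reach a mission-critical server?"
print(find_path("cafeteria_wifi", "mission_critical_server"))
```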

His other major project is to increase the JPL’s “cyber situational awareness” — in other words, instrumenting their systems to collect and analyze data, in real time, to detect attacks and other anomalous behavior. For instance, a spike in CPU usage might indicate a compromised server being used for cryptocurrency mining.

In the bad old days, security was reactive: if someone had a problem and couldn’t access their machine, they’d call, but that was the extent of their observability. Nowadays, they can watch for malicious and anomalous patterns which range from the simple, such as a brute-force attack indicated by many failed logins followed by a successful one, to the much more complex, e.g. machine-learning based detection of a command system operating outside its usual baseline parameters.
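The simple end of that spectrum is easy to picture. Here’s a minimal sketch of the many-failures-then-success pattern; the event format and the threshold of 10 failures are arbitrary illustrative choices, not JPL’s actual detection rules:

```python
def looks_like_brute_force(events, threshold=10):
    """Flag accounts with a long run of failed logins followed by a success.

    `events` is an ordered list of (username, succeeded) tuples; the
    threshold of 10 consecutive failures is an illustrative value only.
    """
    failures = {}
    alerts = []
    for user, succeeded in events:
        if succeeded:
            if failures.get(user, 0) >= threshold:
                alerts.append(user)
            failures[user] = 0
        else:
            failures[user] = failures.get(user, 0) + 1
    return alerts

# Twelve failed logins for one account, then a success: flag it.
events = [("ops_admin", False)] * 12 + [("ops_admin", True)]
print(looks_like_brute_force(events))  # ['ops_admin']
```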

Of course, sometimes it’s just an anomaly, not an attack. Conversely, this new observability is also helping to identify system inefficiencies, memory leakage, etcetera, proactively rather than reactively.

This may all seem fairly basic if you’re accustomed to, say, your Digital Ocean dashboard and its panoply of server analytics. But re-engineering an installed base of heterogeneous complex legacy systems for observability at scale is another story entirely. Looking at the borders and interfaces isn’t enough; you have to observe all the behavior inside the perimeter too, especially in light of partners with privileged access, who might abuse that access if compromised. (This was the root cause of the infamous 2018 attack on the JPL.)

While the JPL’s threat model is fairly unique, Viswanathan’s work is quite representative of our brave new world of cyberwarfare. Whether you’re a space agency, a big company, or a growing startup, your information security nowadays needs to be proactive. Ongoing monitoring of anomalous behavior is key, as is thinking like an attacker; reacting after you find out something bad happened is not enough. May your organization learn this the easy way, rather than joining the seemingly endless list of headlines telling us all of breach after breach.

Startups Weekly: Asana numbers likely to be what the market wants

[Editor’s note: Want to get this weekly review of news that startups can use? Just subscribe here.] 

Asana may get more attention than the average SaaS company due to the Facebook pedigrees and outspoken views of its founders, but in practice it’s a low-profile, cash-efficient machine. Today, the productivity toolmaker does not need to raise cash via a traditional IPO, as we explored this week following its filing for a direct listing, even though it hasn’t raised that much money compared to other unicorns.

Alex Wilhelm dug into public numbers on Extra Crunch to make an educated guess about its pricing prospects:

Let’s presume that Asana crossed the $100 million ARR mark as 2018 came to a close. And, for the sake of discussion, that its eight quarters of revenue growth acceleration left the company with a 60% expansion rate. Then, Asana would have closed up 2019 with $160 million in ARR. (You can easily change up the numbers by tweaking when the company reached the nine-figure ARR mark and its ensuing growth rate.) …

Asana is likely worth more than its final private valuation of $1.5 billion. Presuming it can get a bog-standard 12x multiple on its ARR, the company would be worth $1.8 billion. If it can do better, or is larger than that, the value of the firm quickly rises.
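If you want to play with Wilhelm’s assumptions yourself, the arithmetic fits in a few lines of Python. The starting ARR, growth rate and multiple below are the same rough guesses as in the excerpt, not disclosed figures, so treat the outputs as ballpark only:

```python
def estimated_arr(base_arr, growth_rate):
    """One year of growth on an assumed starting ARR."""
    return base_arr * (1 + growth_rate)

# Wilhelm's rough scenario: ~$100M ARR at the end of 2018 growing ~60% in 2019.
arr_2019 = estimated_arr(100e6, 0.60)
print(f"Estimated 2019 ARR: ${arr_2019 / 1e6:.0f}M")            # ~$160M

# Change the starting point or growth rate to see how the estimate moves.
print(f"More conservative: ${estimated_arr(90e6, 0.45) / 1e6:.0f}M")

# Applying a SaaS revenue multiple gives a ballpark valuation; the excerpt's
# $1.8B figure reflects its own rounding, so any output here is approximate.
print(f"At 12x ARR: ${arr_2019 * 12 / 1e9:.2f}B")
```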

Unlike Casper’s struggles, and One Medical’s somewhat surprising consumery pop, Asana is a straightforward bet for a good public performance based on traditional SaaS metrics. Stay tuned for more next week.


VCs are still pouring money into open source

In this week’s investor survey, Arman Tabatabai talked to 18 of the most active and successful investors in open-source and devops software about the latest trends. The money going into the sector has grown at a 10% CAGR over the last five years, and nobody he talked to plans to slow down — in fact, many said the market was under-heated, or just halfway there. Why? Every company is trying to become more of a software company, developers now get to make more adoption and purchasing decisions, and there are countless software problems yet to solve.


The latest startup funds are even more meta

It seems like everyone wants to invest in tech startups these days, including any large company or government body — and even tech startups. In the latest news on this long-running trend, cap table management unicorn Carta is starting its own fund to invest in companies. Given its in-house data and broad relationships in the industry, this seems like great positioning for some hot deals (as long as the clients on the platform don’t mind, of course).

Meanwhile, a couple of successful, currently active founders will also be ramping up their seed investments. Superhuman founder and CEO Rahul Vohra and Eventjoy founder Todd Goldberg are teaming up to create “The Todd & Rahul Angel Fund” which will put $7 million from an LP base of other founders and operators to work. The dollars involved may be small, but the signaling is likely to be very high.

Organized (tech) labor

Silicon Valley investors and founders have avoided unions for decades by giving employees a cut of the ownership directly. But is this arrangement changing? The rise of gig work, the questions about high valuations and future stock prices, the grind of life at many unicorn startups, and general concern about tech culture and ethics have combined to make some workers look harder at unions, as Megan Rose Dickey covered this week in an ongoing series.

Other workers, meanwhile, are striking out to form tech coops that share ownership from the start. She talked to a couple folks on this front as well, including one coop that is helping ride-share drivers to make more money.

Around the horn

Here’s why so many fintech startups are loaning to small businesses (EC)

Europe risks squandering its global advantage in deep tech innovation (TC)

What to expect when pitching European VCs (TC)

Dear Sophie: My H-1B was renewed, but I’m getting laid off (EC)

Latin America takes the global lead in VC directed to female co-founders (TC)

Why VCs are dumping money into insurance marketplaces (EC)

As a top manager leaves amid fundraising woes, SoftBank’s vision looks dimmer — and schadenfreude abounds (TC)

Why this VC thinks we’re heading for a cloud slowdown (EC)

#EquityPod

In this week’s episode, Alex and Danny sat down with Rick Yang of NEA, examined Casper and One Medical in more detail, and covered a few new funds and fundraises — including more thoughts on the Asana numbers. Check it out!

‘A city where you can pilot almost anything and figure out if it’s going to work’

Scott Bade
Contributor

Scott Bade is a former speechwriter for Mike Bloomberg and co-author of “More Human: Designing a World Where People Come First.”

As founding executive director of Tech:NYC, Julie Samuels is one of the state’s most prominent advocates for the tech sector, both in Albany and at City Hall.

Samuels, a lawyer by training, came to New York after serving as executive director of Engine, a San Francisco organization on which Tech:NYC is modeled. In an interview with TechCrunch, Samuels spoke about several issues, including her rationale for why, despite the controversy over Amazon’s decision not to build its second headquarters in Queens, the area is well-positioned for the next wave of tech innovation.

TechCrunch: What is the need for organizations like Tech:NYC and Engine?

Julie Samuels: As the tech industry matures, it is incredibly important that there are organizations [that] represent these companies politically, civically, making sure they have a seat at the table with so many public policy debates. There is no shortage of public policy debates surrounding technology.

It is also incredibly important that there are organizations who are talking from the viewpoint of smaller companies and startups. There are a lot of organizations that represent the biggest and most well-known companies, including Tech:NYC. But [we] also have hundreds of members who are small and growing startups. We think that diversity of the ecosystem is what really sets the technology sector apart and it is something we want to foster and celebrate.

Who are your members, then?

Why your next TV needs ‘filmmaker mode’

TVs this year will ship with a new feature called “filmmaker mode,” but unlike the last dozen things the display industry has tried to foist on consumers, this one actually matters. It doesn’t magically turn your living room into a movie theater, but it’s an important step in that direction.

This new setting arose out of concerns among filmmakers (hence the name) that users were getting a sub-par viewing experience of the media that creators had so painstakingly composed.

The average TV these days is actually quite a quality piece of kit compared to a few years back. But few ever leave their default settings. This was beginning to be a problem, explained LG’s director of special projects, Neil Robinson, who helped define the filmmaker mode specification and execute it on the company’s displays.

“When people take TVs out of the box, they play with the settings for maybe five minutes, if you’re lucky,” he said. “So filmmakers wanted a way to drive awareness that you should have the settings configured in this particular way.”

In the past they’ve taken to social media and other platforms to mention this sort of thing, but it’s hard to say how effective a call to action is, even when it’s Tom Cruise and Chris McQuarrie begging you:

I’m taking a quick break from filming to tell you the best way to watch Mission: Impossible Fallout (or any movie you love) at home. pic.twitter.com/oW2eTm1IUA

— Tom Cruise (@TomCruise) December 4, 2018

While very few people really need to tweak the gamma or adjust individual color levels, there are a couple settings that are absolutely crucial for a film or show to look the way it’s intended. The most important are ones that fit under the general term “motion processing.”

These settings go by a variety of fancy-sounding names — “game mode,” “motion smoothing,” “truemotion” and so on — and they are on by default on many TVs. What they do differs from model to model, but it amounts to taking content at, say, 24 frames per second, and converting it to content at, say, 120 frames per second.

Generally this means inventing the images that come between the 24 actual frames — so if a person’s hand is at point A in one frame of a movie and point C in the next, motion processing will create a point B to go in between — or B, X, Y, Z, and dozens more if necessary.
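The invented “point B” is, at its crudest, just a weighted blend of the real frames on either side. Here’s a toy version of that interpolation in Python — real TVs add motion estimation and warping on top, so this is only meant to show the basic idea of manufacturing frames that were never shot:

```python
def interpolate_frames(frame_a, frame_c, steps):
    """Linearly blend two frames to invent `steps` in-between frames.

    Frames are flat lists of pixel values here; real motion processing
    estimates where objects moved and warps pixels accordingly, but the
    blend below shows the basic idea of fabricating intermediate frames.
    """
    invented = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)
        invented.append([a * (1 - t) + c * t for a, c in zip(frame_a, frame_c)])
    return invented

# 24 fps source on a 120 fps display: four invented frames between each real pair.
frame_a = [0.0, 0.2, 0.9]   # toy "pixels"
frame_c = [1.0, 0.4, 0.1]
for f in interpolate_frames(frame_a, frame_c, steps=4):
    print([round(p, 2) for p in f])
```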

This is bad for several reasons:

First, it produces a smoothness of motion that lies somewhere between real life and film, giving an uncanny look to motion-processed imagery that people often say reminds them of bad daytime TV shot on video — which is why people call it the “soap opera effect.”

Second, some of these algorithms are better than others, and some content is a better fit for them than the rest (sports broadcasts, for instance). While at best they produce the soap opera effect, at worst they can produce weird visual artifacts that can distract even the least sensitive viewer.

And third, it’s an aesthetic affront to the creators of the content, who usually crafted it very deliberately, choosing this shot, this frame rate, this shutter speed, this take, this movement, and so on with purpose and a careful eye. It’s one thing if your TV has the colors a little too warm or the shadows overbright — quite another to create new frames entirely with dubious effect.

So filmmakers, and in particular cinematographers, whose work crafting the look of the movie is most affected by these settings, began petitioning TV companies to either turn motion processing off by default or create some kind of easily accessible method for users to disable it themselves.

Ironically, the option already existed on some displays. “Many manufacturers already had something like this,” said Robinson. But with different names, different locations within the settings, and different exact effects, no user could really be sure what these various modes actually did. LG’s was “Technicolor Expert Mode.” Does that sound like something the average consumer would be inclined to turn on? I like messing with settings, and I’d probably keep away from it.

So the movement was more about standardization than reinvention. With a single name, icon, and prominent placement instead of being buried in a sub-menu somewhere, this is something people may actually see and use.

Not that there was no back-and-forth on the specification itself. For one thing, filmmaker mode also lowers the peak brightness of the TV to a relatively dark 100 nits — at a time when high brightness, daylight visibility, and contrast ratio are specs manufacturers want to show off.

The reason for this is, very simply, to make people turn off the lights.

There’s very little anyone in the production of a movie can do to control your living room setup or how you actually watch the film. But restricting your TV to certain levels of brightness does have the effect of making people want to dim the lights and sit right in front. Do you want to watch movies in broad daylight, with the shadows pumped up so bright they look grey? Feel free, but don’t imagine that’s what the creators consider ideal conditions.


“As long as you view in a room that’s not overly bright, I’d say you’re getting very close to what the filmmakers saw in grading,” said Robinson. Filmmaker mode’s color controls are rather loose, he noted, but you’ll get the correct aspect ratio, white balance, no motion processing, and generally no weird surprises from not delving deep enough in the settings.

The full list of changes can be summarized as follows:

  • Maintain source frame rate and aspect ratio (no stretched or sped up imagery)
  • Motion processing off (no smoothing)
  • Peak brightness reduced (keeps shadows dark — this may change with HDR content)
  • Sharpening and noise reduction off (standard items with dubious benefit)
  • Other “image enhancements” off (non-standard items with dubious benefit)
  • White point at D65/6500K (prevents colors from looking too warm or cool)

All this, however, relies on people being aware of the mode and choosing to switch to it. Exactly how that will work depends on several factors. The ideal option is probably a filmmaker mode button right on the clicker, which is at least theoretically the plan.

The alternative is a content specification — as opposed to a display one — that allows TVs to automatically enter filmmaker mode when a piece of media requests it to. But this requires content providers to take advantage of the APIs that make the automatic switching possible, so don’t count on it.

And of course this has its own difficulties, including privacy concerns — do you really want your shows to tell your devices what to do and when? So a middle road where the TV prompts the user to “Show this content in filmmaker mode? Yes/No” and automatic fallback to the previous settings afterwards might be the best option.

There are other improvements that can be pursued to make home viewing more like the theater, but as Robinson pointed out, there are simply fundamental differences between LCD and OLED displays and the projectors used in theaters — and even then there are major differences between projectors. But that’s a whole other story.

At the very least, the mode as planned represents a wedge that content purists (it has a whiff of derogation but they may embrace the term) can widen over time. Getting the average user to turn off motion processing is the first and perhaps most important step — everything after that is incremental improvement.

So which TVs will have filmmaker mode? It’s unclear. LG, Vizio, and Panasonic have all committed to bringing models out with the feature, and it’s even possible it could be added to older models with a software update (but don’t count on it). Sony is a holdout for now. No one is sure exactly which models will have filmmaker mode available, so just cast an eye over the spec list of any set you’re thinking of getting and, if you’ll take my advice, don’t buy a TV without it.

California’s new privacy law is off to a rocky start

California’s new privacy law was years in the making.

The law, California’s Consumer Privacy Act — or CCPA — took effect on January 1, allowing state residents to reclaim their right to access and control their personal data. Inspired by Europe’s GDPR, the CCPA is the largest statewide privacy law change in a generation. The new law lets users request a copy of the data that tech companies have on them, delete the data when they no longer want a company to have it, and demand that their data isn’t sold to third parties. All of this is much to the chagrin of the tech giants, some of which had spent millions to comply with the law and have many more millions set aside to deal with the anticipated influx of consumer data access requests.

But to say things are going well is a stretch.

Many of the tech giants that kicked and screamed in resistance to the new law have acquiesced and accepted their fate — at least until something different comes along. The California tech scene had more than a year to prepare, but some companies have made it downright difficult and — ironically — in some cases more invasive for users to exercise their rights, largely because every company has a different interpretation of what compliance should look like.

Alex Davis is just one California resident who tried to use his new rights under the law to make a request to delete his data. He vented his annoyance on Twitter, saying companies have responded to CCPA by making requests “as confusing and difficult as possible in new and worse ways.”

“I’ve never seen such deliberate attempts to confuse with design,” he told TechCrunch. He referred to what he described as “dark patterns,” a type of user interface design that tries to trick users into making certain choices, often against their best interests.

“I tried to make a deletion request but it bogged me down with menus that kept redirecting… things to be turned on and off,” he said.

Despite his frustration, Davis got further than others. Just as some companies have made it easy for users to opt-out of having their data sold by adding the legally required “Do not sell my info” links on their websites, many have not. Some have made it near-impossible to find these “data portals,” which companies set up so users can request a copy of their data or delete it altogether. For now, California companies are still in a grace period — but have until July when the CCPA’s enforcement provisions kick in. Until then, users are finding ways around it — by collating and sharing links to data portals to help others access their data.

“We really see a mixed story on the level of CCPA response right now,” said Jay Cline, who heads up consulting giant PwC’s data privacy practice, describing it as a patchwork of compliance.

PwC’s own data found that only 40% of the largest 600 U.S. companies had a data portal. Only a fraction, Cline said, extended their portals to users outside of California, even though other states are gearing up to push similar laws to the CCPA.

But not all data portals are created equally. Given how much data companies store on us — personal or otherwise — the risks of getting things wrong are greater than ever. Tech companies are still struggling to figure out the best way to verify each data request to access or delete a user’s data without inadvertently giving it away to the wrong person.

Last year, security researcher James Pavur impersonated his fiancee and tricked tech companies into turning over vast amounts of data about her, including credit card information, account logins and passwords and, in one case, a criminal background check. Only a few of the companies asked for verification. Two years ago, Akita founder Jean Yang described someone hacking into her Spotify account and requesting her account data as an “unfortunate consequence” of GDPR, which mandated companies operating on the continent allow users access to their data.

(Image: Twitter/@jeanqasaur)

The CCPA says companies should verify a person’s identity to a “reasonable degree of certainty.” For some that’s just an email address to send the data.

Others require sending in even more sensitive information just to prove it’s them.

Indeed, i360, a little-known advertising and data company, until recently asked California residents for a person’s full Social Security number. This recently changed to just the last four digits. Verizon (which owns TechCrunch) wants its customers and users to upload their driver’s license or state ID to verify their identity. Comcast asks for the same, but goes the extra step by asking for a selfie before it will turn over any of a customer’s data.

Comcast asks for the same amount of information to verify a data request as the controversial facial recognition startup, Clearview AI, which recently made headlines for creating a surveillance system made up of billions of images scraped from Facebook, Twitter and YouTube to help law enforcement trace a person’s movements.

As much as CCPA has caused difficulties, it has helped forge an entirely new class of compliance startups ready to help large and small companies alike handle the regulatory burdens to which they are subject. Several startups in the space are taking advantage of the $55 billion expected to be spent on CCPA compliance in the next year — like Segment, which gives customers a consolidated view of the data they store; Osano, which helps companies comply with CCPA; and Securiti, which just raised $50 million to help expand its CCPA offering. With CCPA and GDPR under their belts, their services are designed to scale to accommodate new state or federal laws as they come in.

Another startup, Mine, which lets users “take ownership” of their data by acting as a broker to allow users to easily make requests under CCPA and GDPR, had a somewhat bumpy debut.

The service asks users to grant it access to their inbox, scanning for email subject lines that contain company names and using that data to determine which companies a user can request their data from or have their data deleted. (The service requests access to a user’s Gmail, but the company claims it will “never read” users’ emails.) Last month during a publicity push, Mine inadvertently copied a couple of emailed data requests to TechCrunch, allowing us to see the names and email addresses of two requesters who wanted Crunch, a popular gym chain with a similar name, to delete their data.

(Screenshot: Zack Whittaker/TechCrunch)

TechCrunch alerted Mine — and the two requesters — to the security lapse.

“This was a mix-up on our part where the engine that finds companies’ data protection offices’ addresses identified the wrong email address,” said Gal Ringel, co-founder and chief executive at Mine. “This issue was not reported during our testing phase and we’ve immediately fixed it.”

For now, many startups have caught a break.

The smaller, early-stage startups that don’t yet make $25 million in annual revenue or store the personal data of more than 50,000 users or devices will largely escape having to immediately comply with CCPA. But it doesn’t mean startups can be complacent. As early-stage companies grow, so will their legal responsibilities.

“For those who did launch these portals and offer rights to all Americans, they are in the best position to be ready for these additional states,” said Cline. “Smaller companies in some ways have an advantage for compliance if their products or services are commodities, because they can build in these controls right from the beginning,” he said.

CCPA may have gotten off to a bumpy start, but time will tell if things get easier. Just this week, California’s attorney general Xavier Becerra released newly updated guidance aimed at trying to “fine tune” the rules, per his spokesperson. It goes to show that even California’s lawmakers are still trying to get the balance right.

But with the looming threat of hefty fines just months away, time is running out for the non-compliant.

This Week in Apps: Chinese giants take on Google Play, Iowa caucus disaster, TikTok’s power over App Store charts

Welcome back to This Week in Apps, the Extra Crunch series that recaps the latest OS news, the applications they support and the money that flows through it all.

The app industry is as hot as ever, with a record 204 billion downloads and $120 billion in consumer spending in 2019, according to App Annie’s recently released “State of Mobile” annual report. People are now spending 3 hours and 40 minutes per day using apps, rivaling TV. Apps aren’t just a way to pass idle hours — they’re a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus.

In this Extra Crunch series, we help you keep up with the latest news from the world of apps, delivered on a weekly basis.

This week, we look at the app making headlines for causing a disaster in Iowa, TikTok’s power to move apps up the charts, all the news from Apple’s new betas, the plan from Chinese mobile giants to take on Google Play, subscription scams, plus app trends and other news.

Headlines

Iowa’s caucus app was a disaster

A smartphone app really screwed things up in Iowa. The app, built by Shadow Inc., was designed to help the Iowa Democratic Party tabulate votes from the caucuses. But instead of helping, the app failed, causing a massive delay of almost an entire day. According to The New York Times, the app was quickly put together in just the past two months — and wasn’t properly tested.

Facebook has acquired Scape Technologies, the London-based computer vision startup

Scape Technologies, the London-based computer vision startup working on location accuracy beyond the capabilities of GPS, has been acquired by Facebook, according to a regulatory filing.

Full terms of the deal remain as yet unknown, although a Companies House update reveals that Facebook Inc. now has majority control of the company (more than 75%). However, by looking at other filings, including a recent share issue, I understand the price could be about $40 million.

Further filings show that Scape’s previous venture capital representatives have resigned from the Scape board and are replaced by two Facebook executives.

Scape’s backers included Entrepreneur First (EF) — the startup is an alumnus of the company builder program — along with LocalGlobe, Mosaic Ventures and Fly Ventures.

Noteworthy is that EF and Fly Ventures have both already had a joint exit to Facebook of sorts, when Bloomsbury AI was acqui-hired by the social networking behemoth (a story that I also broke).

Founded in 2017, Scape Technologies was developing a “Visual Positioning Service” based on computer vision which lets developers build apps that require location accuracy far beyond the capabilities of GPS alone.

The technology initially targeted augmented reality apps, but also had the potential to be used to power applications in mobility, logistics and robotics. More broadly, Scape wanted to enable any machine equipped with a camera to understand its surroundings.

Scape CEO and co-founder Edward Miller previously described Scape’s “Vision Engine” as a large-scale mapping pipeline that creates 3D maps from ordinary images and video. Camera devices can then query the Vision Engine using the startup’s “Visual Positioning Service” API to determine their exact location with far greater precision than GPS can ever provide. The Visual Positioning Service was made available to select developers via Scape’s SDK.

Meanwhile, the acquisition by Facebook, no matter what form it takes, looks like a good fit given the U.S. company’s investment in next-generation platforms, including VR and AR. It is also another — perhaps worrying — example of U.S. tech companies hoovering up U.K. machine learning and AI talent early.

Update: A Facebook spokesperson provided the following statement: “We acquire smaller tech companies from time to time. We don’t always discuss our plans.”

Uber claims top spot in Indian ride-hailing market

Uber facilitated 14 million rides a week in India last year, the American ride-hailing firm said as it claimed the top spot in the key overseas market.

In a report (PDF) published on the sidelines of its quarterly earnings Thursday afternoon, Uber said that it commanded over 50% of the ride-hailing market in India — among some other regions — and was the category leader.

The publicly listed company said the claim was based on its internal estimates. In comparison, Uber handled 11 million rides a week in India in 2018, a spokesperson told TechCrunch.

Slide from Uber’s report

The revelation is especially interesting, since both Uber and its chief local rival Ola have tended to avoid talking about the number of rides they serve in India.

In a 2018 blog post, Ola revealed that its platform “moves over two million people every day.” A spokesperson for the Indian startup, which like Uber counts SoftBank as an investor, declined to reveal the new figures, but issued a statement in which it identified itself as India’s “largest mobility platform.”

“As India’s largest mobility platform, Ola serves over 200 million customers through a network of 2.5 million driver-partners across a wide range of offerings including two, three and four-wheelers,” the spokesperson said, adding that the ride-hailing firm operates in 250 cities and towns in India.

Last month, Uber sold its Uber Eats food delivery business in India to local rival Zomato for about $180 million, in a move that some analysts said could help the ride-hailing firm better focus on its core business in the country.

An Uber spokesperson told TechCrunch that the company plans to expand from about 50 Indian cities where it currently operates to 200 in the country by the end of the year. It will focus on onboarding two-wheelers and three-wheelers in many of these cities, the firm said.

Uber’s expansion in India comes as Ola is entering one of the American firm’s key territories. Last week, Ola said it will begin operation in London on February 10.