RSS Feeds

The elcomCMS RSS Module lets you easily display external RSS feeds on your site. It can also publish configurable feeds of your own content, allowing both internal and external users to subscribe via RSS to real-time content updates through any RSS reader.
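For readers curious what the subscribe side looks like in practice: published feeds are plain RSS 2.0 XML, which any client can consume. The sketch below is a minimal standard-library Python reader, assuming nothing about elcomCMS itself; the sample XML, the `parse_rss` helper and its field names are illustrative, not part of any real feed.

```python
# Minimal RSS 2.0 reader using only the Python standard library.
# Any feed a CMS publishes can be consumed this way; the sample XML
# below is illustrative, not a real elcomCMS feed.
import xml.etree.ElementTree as ET

def parse_rss(xml_text: str) -> list[dict]:
    """Return one dict per <item>, with its title, link and pubDate."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "pubDate": item.findtext("pubDate", default=""),
        })
    return items

SAMPLE = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example feed</title>
  <item><title>First post</title><link>https://example.com/1</link>
        <pubDate>Tue, 21 Apr 2026 16:36:42 GMT</pubDate></item>
</channel></rss>"""

for entry in parse_rss(SAMPLE):
    print(entry["title"], "-", entry["pubDate"])
```

A reader like this is all an "RSS reader" needs at its core: fetch the feed URL, parse the items, and poll periodically for new entries.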



  1. Deezer says AI-made songs make up 44 percent of daily uploads (Tuesday, 21 April 2026)

    AI-generated music is spreading like wildfire, according to Deezer, which reported receiving nearly 75,000 uploads of AI-made tracks a day on its platform. The Paris-based alternative music streaming service published a report revealing that 44 percent of its daily uploads are AI-generated songs, amounting to around 2 million flagged songs a month. If that figure doesn't alarm you, consider that Deezer said more than 13.4 million songs were detected and flagged as AI-generated across 2025.
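    As a quick back-of-the-envelope check (my arithmetic, not Deezer's), the reported daily and monthly figures are consistent with each other:

```python
# Rough consistency check of Deezer's reported numbers (approximate,
# since daily upload volume varies across the month).
daily_ai_uploads = 75_000            # "nearly 75,000 uploads ... a day"
per_month = daily_ai_uploads * 30    # simple 30-day month
print(f"{per_month:,} AI-flagged tracks a month")  # ~2.25 million, in line with "around 2 million"
```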

    Those statistics are made possible with Deezer's patent-pending AI music detection tool, which was launched in January 2025. A few months following the release, Deezer announced that it saw around 20,000 AI-generated tracks uploaded a day, which made up roughly 18 percent of its overall uploads. Despite the swell of AI music on its platform, Deezer said that only about 1 to 3 percent of total streams on the platform involve AI-generated music and that a majority of these streams are marked as fraudulent and demonetized.

    Deezer said its proprietary tool can detect AI-generated music, particularly from two of the most popular offerings right now: Suno and Udio. Despite these two AI music tools getting hit with lawsuits in their early days, some major record labels have had a change of heart and later struck deals with the startups. Meanwhile, other music streaming platforms are employing their own verification tools to fortify the floodgates holding back music made by AI. Similar to Deezer, Coda Music uses "AI Artist" labels and even lets users flag suspicious artists.

    This article originally appeared on Engadget at https://www.engadget.com/entertainment/music/deezer-says-ai-made-songs-make-up-44-percent-of-daily-uploads-163642921.html?src=rss
  2. DaVinci Resolve 21 hands-on: A viable Lightroom alternative for casual users (Tuesday, 21 April 2026)

    Blackmagic Design’s DaVinci Resolve is a highly capable free color grading tool with a history dating way back to the ‘80s, but it has never been thought of as a rival to Adobe’s Lightroom due to its video origins. Now, Blackmagic Design has released a new version in beta that may change people’s minds about that. The new Photo page lets you import RAW images then adjust them using Resolve’s powerful color grading tools. You even get access to advanced VFX and AI features not found in Lightroom.

    When I saw the new feature, I immediately wondered if I could cancel my $20 per month Adobe Photography subscription (with Lightroom CC and Photoshop CC). Apparently, I'm not alone. After trying it out, I believe that I could do so because photos are secondary to video for me. However, photographers who’ve used Lightroom for a long time would likely find it too painful to switch — at least, for now.

    I tested the new Photo page functions and many of Resolve’s new filter effects, but beware that the first beta is still buggy. I used it inside the $295 DaVinci Resolve Studio app (which includes free updates for life), because it has a few extra features not found in the free version.

    With that said, DaVinci Resolve 21 now supports RAW photos from Canon, Fujifilm, Nikon and Sony, with other brands to follow. Blackmagic has pledged to support RAW files for new cameras shortly after they’re released like Adobe does with Lightroom. It also supports TIFFs, JPEGs, HEIFs and other photo file formats.

    [Photo: Steve Dent for Engadget]

    To organize files, you can think of Resolve’s Projects as equivalent to Lightroom Catalogs. You import photos into a Project just as you do video, by dragging and dropping photos or folders into the media pool or using the “import” function. Resolve’s dedicated Media file management page also supports RAW photos. I find Resolve’s import system to be easier and more logical than Lightroom’s, with fewer steps required. You can import a full or partial Lightroom catalog into Resolve as well.

    Once your photos are in the media pool, you can select and organize them by file name, rating, colors, favorites and other tags. DaVinci Resolve Studio also offers a new feature called AI IntelliSearch that lets you visually identify photos based on their content using terms like “cats” or “dancing.”

    Photos can then be moved from the media pool into Albums, a new feature that’s similar to Lightroom’s Collections. Albums activate several photo-specific features in the Color and Edit pages. In Edit, Albums appear as simplified, single-track timelines, with each photo shown as a two-second clip. That way, you can work with photos in the Color and Fusion pages just as you do with video.

    You can reframe and crop images inside Photo (either by typing in the size or dragging) and make basic RAW-style adjustments for settings like exposure, highlights and shadows. For more advanced grading (like you may do in Lightroom’s Develop), you need to jump into the Color page.

    On the Color page, you get the same functions for photos as video: primary and log color correction, curves, qualifiers, power windows, noise reduction and sharpening. You can also employ Resolve’s class-leading scopes, including parades, waveforms, vectorscopes and histograms.

    Once you’ve created an Album, you can select it at the top of the Color page viewer, just as you would a video timeline. You can also label and sort photos as you do in the Photo page.


    Resolve's node-based workflow really shines for photo editing. You can add nodes in series or parallel to build complex grades, then save and apply those grades to multiple images or an entire photo album. Resolve’s system for doing this via “stills” that show your grade is more visual and powerful than the one in Lightroom. You also get support for Resolve’s functions used for video like Look-Up Tables (LUTs) and the new Film Look Creator effect.

    All of Resolve’s filter-style effects — like Vignette, Lens Blur and Film Damage — are available directly from the Photo page. Those include some of DaVinci Resolve Studio’s new AI effects (not available in the free version) like AI CineFocus, AI Face Age Transformer and AI Ultrafocus. This gives the app a leg up over Lightroom, which only offers comparable features via third-party plugins.

    If you want even more advanced effects, the Fusion page is Resolve’s equivalent to After Effects. There, you’ll find tools like warping, lights and Paint, which lets you do Photoshop-like cloning. Resolve 21 now includes the Krokodove filters with features like warping and text animation.

    This raises the question of whether you can do multi-image compositing in the Photo page like you can in Photoshop or After Effects. In short, it’s not possible as Photo only supports one image at a time. However, once you’ve adjusted a RAW image, you can drop it into a video timeline where your color adjustments and other tweaks will carry over. Then, you can stack multiple images and use any of Resolve’s compositing tools from the Edit or Fusion pages. This is pretty clunky compared to using Photoshop, but it’s the only way to combine multiple images for now.

    [Photo: DaVinci Resolve 21's updated export page for photos (Steve Dent for Engadget)]

    Once you’ve finished grading and adjusting images, there are two ways to export them. One is the Quick Export function, which provides minimal settings like file type, name and resolution. The better method is via Resolve’s Deliver page, which now has dedicated photo functions when you’re working with an Album. There, you can control size parameters like short and long side, width and height or percentage, and you can also change the file type, resolution and quality. Even so, this function lags well behind Lightroom’s, which offers advanced settings missing from Resolve, like content credentials, watermarking and post-processing.

    Finally, another new dedicated feature within the Resolve Photo page is Capture Live View for camera tethering, which only supports Canon and Sony cameras for now. It allows you to connect a camera to your PC via USB-C and control aperture, shutter speed, ISO and exposure compensation directly from the app. You can also view your images using Resolve’s scopes and tweak RAW settings like white balance, temperature, shadows, highlights and more.

    DaVinci Resolve’s new Photo page can do most of what Lightroom does in terms of image adjustments, while adding powerful effects tools that its Adobe counterpart lacks. It’s not yet a substitute for Photoshop, though, as it lacks the organizing, exporting, compositing and pixel-level editing tools found in that app.

    For now, the Photo page is ideal for filmmakers who dabble in photo editing, along with hobbyists and power users familiar with Resolve’s formidable grading tools. However, professional photographers may want to stick with Lightroom, because Resolve still lacks certain advanced features, particularly around organization and exporting.

    The new DaVinci Resolve Photo page only just launched and is bound to improve greatly over time. If you’re on the fence, download the free version and see if it works for you. A lot of video editors made the same switch from Premiere and have never looked back. Given the current grumbling about Adobe’s subscriptions, I could see many people making the same move from Lightroom to Resolve.

    This article originally appeared on Engadget at https://www.engadget.com/apps/davinci-resolve-21-hands-on-a-viable-lightroom-alternative-for-casual-users-160520123.html?src=rss
  3. Dyson PencilVac Fluffycones review: Almost the perfect floor cleaner for tiny apartments (Tuesday, 21 April 2026)

    The big deal with Dyson’s new vacuum is how small it is. While “pencil” is certainly an ambitious noun to compare a floor vacuum to, the slender body of the PencilVac Fluffycones ($600) brings to mind mops and brooms rather than hulking cyclone-suction tech and connected cleaning. As we described it last year, it’s the company's most stick-like stick vacuum yet. Dyson has been repurposing its engine tech into smaller form factors for years, such as its hair dryers. However, this is the first time it’s been utilized for floor cleaning.

    Dyson recently launched the PencilVac Fluffy, with a more traditional dual-roller system, but I’m focusing on the Fluffycones iteration and will call it simply the PencilVac for brevity. It combines several delightful design features, and while surprisingly potent, it’s — predictably — not quite able to match the power of its bigger brothers. The entire vacuum weighs under four pounds, which adds to its ease of use. That said, it’s heavier than it looks, as the rod shape holds everything inside.

    With a 40mm-diameter (almost 1.6-inch) handle, it’s usable even with a single hand, if you want to be extra casual in your cleaning habits. It’s also a delight to use. As with several previous Dysons, like the OmniGlide, the dual-roller system seems to help the PencilVac glide across hard floors. It can be pulled and coaxed around furniture, table legs, under low-profile credenzas and more.

    When at rest, the PencilVac sits on four central wheels, but once you start cleaning, the suction creates a sort of floating effect. Those four Fluffycone heads are designed to resist hair tangles, with any captured hair bundled like yarn at the tips of each cone for easier cleaning at the center of the PencilVac. The tips of each cone mean the vacuum can reach the edges of my flooring, too. The Fluffycones aren’t really able to dig into the pile, meaning that while they can certainly pull and lift a layer of dirt and dust, anything deeper will stay there.

    [Photo: Mat Smith for Engadget]

    That said, the PencilVac is a dedicated hard floor cleaner. If your home is entirely hardwood or has lots of tile, then it’s not an issue. But if you have carpeted rooms (or large rugs), you might want to consider other models. Also, at higher suction levels, the PencilVac’s rollers would occasionally cut out while vacuuming carpets and rugs.

    The PencilVac is also equipped with front and back Dyson Detect lasers, making it easier to see where you’ve missed (and how terrible your floor cleaning habits are). Compared to bigger, more powerful models like the V15 Detect, the PencilVac had a greater tendency to spit out dirt when overwhelmed by larger amounts of debris. This only happened for me when I pushed the limits of what the PencilVac could handle: a handful of garden soil on my hardwood floor.

    Because it’s 2026, of course, it’s a connected vacuum, too. When I put the PencilVac together and charged it, the first thing it did was project a QR code for pairing. A quick firmware update later, and it was ready for use. The only feature in the MyDyson app that you might find useful is maintenance reminders (i.e., when to clean the filter). Everything else, including a battery readout, is built into the handle.


    Compared to Dyson’s other stick vacs, there is no removable dust canister or battery block. The collection bin is cleverly integrated, drawing in dirt and using the same suction to compact it with force. When you need to empty the compartment (which holds a surprising 0.8 liters, according to Dyson), you remove the cleaning head, point the body at your garbage can and ‘slide’ the dust out, compacting it further in the process. The collection area is also see-through, so you can watch everything you pull into the PencilVac build up like a sort of gauge. When emptying the stick vac, I also noticed it doesn’t produce the dust cloud you often get with vacuums, which is another nice improvement.

    Unlike its other stick vacuums, Dyson’s PencilVac comes with a free-standing charging dock, instead of a wall mount. This makes storage a little more versatile. If you’re a renter, there’s no need for any drilling. I think this also speaks to how it’s meant to be used: briefly, in bursts.

    To that point, battery life seemed to be around 20 minutes, depending on the power level used and how dirty the surfaces were. Running entirely on boost, I got under 10 minutes of use. This weakness is exacerbated by a sluggish charging time of over three hours. For smaller spaces and not-too-messy lives, that’s more than enough clean time, though.


    The PencilVac carries that Dyson premium, priced at $600, though that’s still less than the company’s other recent stick vacuums: $50 less than the aforementioned V15 Detect, for example. If your home has a lot of carpeting or rugs to clean, the bigger, more powerful models might be a better choice. The newer PencilVac Fluffy, with its more traditional rollers, is also $150 cheaper.

    With its minimalist form factor, the PencilVac is still an engineering marvel. Its high degree of mobility makes it easy to clean in tight corners and between furniture. I just wish it were slightly more powerful.

    This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/dyson-pencilvac-fluffycones-review-almost-the-perfect-floor-cleaner-for-tiny-apartments-154550089.html?src=rss
  4. The Elden Ring movie hits theaters on March 3, 2028 (Tuesday, 21 April 2026)

    Bandai Namco and A24 have announced that the Elden Ring movie will hit theaters on March 3, 2028. Filming is set to begin in the next several weeks. The movie was first revealed over a year ago, so this is a welcome update. 

    We also got a full cast announcement, though the companies haven't said who or what everyone is portraying. The cast includes Kit Connor from Heartstopper, Ben Whishaw from the beloved Paddington movies and Cailee Spaeny from Alien: Romulus. Peter Serafinowicz, Jonathan Pryce, Nick Offerman and Sonoya Mizuno will also appear in the film.

    Elden Ring will be written and directed by Alex Garland, fresh off the harrowing Civil War. Garland has directed plenty of sci-fi, with credits like Ex Machina, Annihilation and the woefully underrated TV show Devs. He hasn't, however, made any legit fantasy, so we'll have to see how he handles the magic-filled continent known as The Lands Between.

    In any event, we have nearly two years before finding out. By that time, theaters will have already experienced two new Avengers and Star Wars films. Elden Ring, the game, is getting some new DLC content this year with armor sets, weapons and skins.

    This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/the-elden-ring-movie-hits-theaters-on-march-3-2028-154411670.html?src=rss
  5. Artemis II commander shares a remarkable video of Earth vanishing behind the Moon (Tuesday, 21 April 2026)

    We’ve seen some astonishing photos of an Earthset — the Earth setting behind the Moon — from the Artemis II crew’s history-making trip around our planet’s closest neighbor. Now, Reid Wiseman, the mission’s commander, has shared a remarkable video of that same phenomenon.

    While mission specialist Christina Koch was using a Nikon camera to snap stunning still images of the Earthset, Wiseman used an iPhone 17 Pro Max to film the moment. “I could barely see the Moon through the docking hatch window but the iPhone was the perfect size to catch the view… This is uncropped, uncut with 8x zoom which is quite comparable to the view of the human eye,” he wrote on X.

    This was the first time in 54 years, since the Apollo 17 mission, that human eyes had witnessed an Earthset. The Artemis II crew flew more than 5,000 miles beyond the Moon as they travelled more than a quarter of a million miles away from Earth, the furthest any humans have ever been from terra firma.

    I, like many people, overuse the word “awesome.” It should only really be used when something actually inspires awe. This video absolutely meets that mark. It’s genuinely awesome.

    This article originally appeared on Engadget at https://www.engadget.com/science/space/artemis-ii-commander-shares-a-remarkable-video-of-earth-vanishing-behind-the-moon-152036403.html?src=rss
  6. Apple could be fined up to $38 billion by Indian antitrust regulator (Tuesday, 21 April 2026)

    Apple's refusal to provide financial data to an Indian regulatory agency as part of an antitrust case will culminate in a final hearing on May 21, as first reported by Reuters. According to the Competition Commission of India (CCI), Apple still hasn't submitted information about its financials and its views on an antitrust investigation that started in October 2024.

    The case revolves around the CCI accusing Apple of exploiting its dominant position with the App Store, arguing that developers are forced to use Apple's proprietary system for in-app purchases. Apple countered that Android is the dominant smartphone operating system in India and that iPhones hold a comparatively small market share there. However, Apple has slowly been gaining momentum in the Indian smartphone market, hitting nine percent in 2025, according to data from Counterpoint Research.

    Reuters reported that the latest CCI order said that Apple had plenty of opportunities to file objections or suggestions, but added that the company still hadn't submitted the "requisite financial information," which is used to determine the amount of a potential penalty. Apple argued that the penalties could be up to $38 billion and responded to the order by citing a separate case where the tech giant challenged the country's antitrust penalty law.

    It's not the first time Apple has butted heads with the Indian government, as it previously refused to pre-install a state-owned app called Sanchar Saathi onto its smartphones. The Indian government later decided to withdraw its mandate requiring smartphone makers to install the app, but it's much less willing to budge on this antitrust case. According to Reuters, the CCI offered Apple two more weeks to file any responses before the final hearing date next month.

    This article originally appeared on Engadget at https://www.engadget.com/big-tech/apple-could-be-fined-up-to-38-billion-by-indian-antitrust-regulator-150821172.html?src=rss
  7. The Mandalorian and Grogu director used Apple Vision Pro to preview the film in IMAX (Tuesday, 21 April 2026)

    Director Jon Favreau (Iron Man, The Jungle Book) hasn't been shy about embracing new technology for filmmaking. While producing The Mandalorian for Disney+, he was one of the first filmmakers to use ILM's massive LED screens, AKA "The Volume," to produce more realistic lighting and backgrounds on studio sets. For the feature film The Mandalorian and Grogu, which hits theaters May 22, Favreau recently revealed that he had Disney build an Apple Vision Pro app to preview its full IMAX scope during filming.

    "So I'm making an IMAX movie, and I'm looking at a TV screen, and no matter how big your TV screen is it's not an IMAX screen," Favreau said in a recent episode of The Town podcast. "We built software so that I can pop on my Apple Vision Pro and be sitting in an IMAX movie theater and see the full aspect ratio when we're lining a shot up. And I can watch that take and see what people will see."

    Favreau isn't the first director to use the Apple Vision Pro — Wicked filmmaker Jon Chu also used it to handle post-production work — but he's the first to specifically mention using the headset for IMAX production. That's still a relatively limited use case for the Apple Vision Pro, but it's one that could be useful to future filmmakers. With its large field of view and sharp micro-OLED screens, the Apple Vision Pro is one of the only ways to replicate the experience of watching a large IMAX screen at home. (The Meta Quest 3 comes in as a close second.)

    In general, Favreau says he's more excited about using existing consumer technology in the filmmaking process than AI. He mentions using the Unreal Engine to previsualize special effects on The Mandalorian and his previous films, and he believes the quality from game engines could be good enough to make it into final productions down the line.

    "This is what the animation industry has understood from the beginning," he said. "Get it right before you ever paint a cel."

    This article originally appeared on Engadget at https://www.engadget.com/entertainment/tv-movies/the-mandalorian-and-grogu-director-used-apple-vision-pro-to-preview-the-film-in-imax-140331311.html?src=rss
  8. GoPro’s Mission 1 camera series will start at $600 (Monday, 20 April 2026)

    We heard all about GoPro's new action camera series last week, but the company is now unveiling pricing across its Mission 1, Mission 1 Pro and Mission 1 Pro ILS cameras. The entry-level Mission 1 ($600) features GoPro's new 50-megapixel 1-inch sensor, which the company says will offer a major leap in image quality and low-light performance over the Hero 13 line. While it looks largely the same as the Hero series (and is still waterproof), the Mission 1 can record 8K video at 30fps and 4K at 120fps. It lacks the higher frame rates of the other Mission 1 cameras, but supports 10-bit GP-Log2 color and 32-bit float audio.

    The Mission 1 Pro ($700) is the flagship fixed-lens model this year, aimed at the professional (or semi-pro) videographer. It upgrades frame-rate capture to 8K at 60fps and 4K at 240fps, along with an extreme "burst" slow-motion mode that hits 960fps at 1080p. It also captures 4:3 "Open Gate" recordings at 8K/30fps and 4K/120fps, covering the entire sensor area and enabling more versatile editing and cropping across different screen sizes, including vertical video.

    [Photo: GoPro Mission 1 camera series (Steve Dent for Engadget)]

    Then there's the beastly Mission 1 Pro ILS (Interchangeable Lens System). It swaps the standard GoPro lens for a Micro Four Thirds (MFT) mount lens. It otherwise shares the same 1-inch sensor and high-speed 8K/60fps video specs as the Pro model. It also matches the Pro model's $700 price, with an additional $100 discount for GoPro subscribers. However, it won't be launching until Q3 2026.

    All of the Mission 1 Series accessories will be available on a rolling basis beginning May 28, with GoPro's own wireless mic system (take note, Rode and DJI) priced at $160. If you preorder a Mission 1 or Mission 1 Pro directly from GoPro now, you'll get the point-and-shoot grip bundled for free. The company still doesn't have an official release date for the cameras.

    This article originally appeared on Engadget at https://www.engadget.com/cameras/gopros-mission-1-camera-series-will-start-at-600-130044898.html?src=rss
  9. Blue Origin landed its recycled New Glenn booster but failed to put payload in orbit (Monday, 20 April 2026)

    Blue Origin has successfully reused a first-stage New Glenn booster for the first time, landing it in a cloud of smoke and fire on a recovery ship. It marks the second flight of Never Tell Me the Odds, after the booster was recovered from New Glenn's previous launch in November last year. However, the rocket company's first commercial mission was marred by a failure to place the communications satellite payload into its intended orbit.

    The launch went smoothly at first: the first-stage GS1 booster separated from New Glenn after three minutes and touched down 10 minutes after launch following two braking burns, as shown in a post on X from Blue Origin's owner, Jeff Bezos.

    However, several hours later the Blue Origin team and satellite manufacturer, AST SpaceMobile, announced that the payload had failed to reach orbit. "We have confirmed payload separation," Blue Origin announced on X. "AST SpaceMobile has confirmed the satellite has powered on. The payload was placed into an off-nominal orbit. We are currently assessing and will update when we have more detailed information." 

    Later, in a press release, AST SpaceMobile revealed that "the satellite separated from the launch vehicle and powered on, [but] the altitude [was] too low to sustain operations with its on-board thruster technology and will [be] de-orbited. The cost of the satellite is expected to be recovered under the company’s insurance policy."

    The upper stage was supposed to position the satellite into a 285 mile orbit after completing two burns. It would have then unfolded a 2,400 square-foot antenna and linked with six other satellites in a test for AST's high-speed direct-to-cell network. However, early telemetry data showed that the satellite only reached 95 miles, well below a sustainable orbit. It's not yet clear how the failure occurred. 

    Despite that, Blue Origin can take some solace in its successful first-stage reuse, particularly since it happened on just the third New Glenn mission (NG-3). It took SpaceX, by comparison, 32 flights before its first successful reflight of a previously flown orbital-class booster. 

    Blue Origin will definitely want to solve the upper stage issue soon. Its next flight is the first New Glenn launch of Amazon Leo (formerly Project Kuiper) broadband satellites. It plans to put 48 of those into orbit to significantly expand the Starlink rival's constellation, which currently sits at 241 satellites. 

    This article originally appeared on Engadget at https://www.engadget.com/science/space/blue-origin-landed-its-recycled-new-glenn-booster-but-failed-to-put-payload-in-orbit-055846419.html?src=rss
  10. The NSA is reportedly using Anthropic's new model Mythos (Monday, 20 April 2026)

    Despite the months-long feud between Anthropic and the Pentagon, the National Security Agency is using the AI company's new Mythos Preview, according to Axios, which spoke to two sources with knowledge of the matter. Anthropic announced Mythos Preview at the beginning of April, describing it as a general-purpose language model that is "strikingly capable at computer security tasks." But back in February, Trump ordered all government agencies to stop using Anthropic's services after the company refused to budge on certain safeguards for military uses during contract talks. 

    The news comes days after Anthropic CEO Dario Amodei met with White House chief of staff Susie Wiles and other officials, reportedly to discuss Mythos. The White House later said the meeting on Friday was "productive and constructive," though President Trump said he had "no idea" about it when asked by reporters, Reuters reports. According to Axios' sources, the NSA is one of the roughly 40 organizations Anthropic gave access to Mythos Preview, and one said it's "being used more widely within the department" too. 

    The company is still embroiled in a legal battle with the US government. Anthropic filed lawsuits against the Department of Defense in two courts in March after the Trump administration labeled it a "supply chain risk," and the Pentagon filed a response shortly after. While Anthropic was granted a preliminary injunction by one court to temporarily block this designation, federal judges in the other denied its motion to lift the label. 

    This article originally appeared on Engadget at https://www.engadget.com/ai/the-nsa-is-reportedly-using-anthropics-new-model-mythos-211502787.html?src=rss