When it comes to AI, is the industry evolving as fast as the technology?

Director of Security Operations in Finance (non-banking)3 years ago
I applaud the progress made, but we have to keep learning. If software that's produced devalues your worth based on your identity, you would consider that a significant emotional event.

The challenge I have is that I often see that approach taken without an understanding of the guiding principles we're going to use: What are the dos and don'ts we will follow until we have those guiding principles, and what will we do to create those checks and balances? I see a lot of eagerness to put out AI simply because it exists, and not much understanding of the potential impacts or willingness to examine them afterwards. That used to be the model many less mature organizations had around security. That's changed, but we need to make sure the same change happens with AI.

Let's keep learning and failing fast but let's also understand what the principles are regarding use, as well as the checks and balances we'll put in place while we learn. I'm not hearing much discussion about that from the same companies that have been working on AI principles for the better part of a year, yet they already have projects in flight for utilizing AI in their products. We need to get in front of that principle-based usage, or we'll end up both slowing AI from a reactionary standpoint—from people who don't understand it—and causing problems from which we won’t be able to recover until people or organizations have been victimized. That concerns me.
President and National Managing Principal in Software3 years ago
You've got to fail fast and learn from it, but I don't see anybody doing that. That's the problem. Companies invest in AI, they get some programmers and data analysts, and they think it's great. But the very few that have been successful are the ones that are running it and holding daily reviews to check: Is this correct? Are we doing it the right way? Can we adjust the knobs?

All of the academic literature that's out there forcefully says that human involvement is a necessity for AI to be successful because we need to have the ability to turn the knobs up or down to recognize when there's bias, etc. But that's the problem we've been living with in tech for eons: If something looks cool then we'll just buy it and put it out there, as if it's just going to work without us having to work at it.
CEO and Co-Founder in Software3 years ago
Today, 99% of your ML and AI are black boxes. Even the researcher doesn't truly know what happens in the actual transformation. All they know is: this is my input, these are my tuning parameters, and this is the desired output.

We do heuristics: we keep running thousands of simulations to get to the desired output and then freeze that model. If there’s any simple change to the model, we have to redo the whole thing. That's why ML and AI are very expensive. When you talk to the researchers, they can say that it's continuous, adaptive, etc., all they want. But the folks who really do it will never speak out because the reality is that it’s a black box and we do it by heuristics, which doesn't sound fancy by any stretch.
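The freeze-the-model workflow described above can be sketched as a simple random search. This is a hypothetical illustration, not anyone's actual pipeline: `run_simulation` stands in for a full training run, and the toy objective is made up for the example.

```python
import random

def run_simulation(params):
    # Stand-in for one training run: score a single parameter
    # setting. The quadratic objective here is purely illustrative.
    x, y = params["x"], params["y"]
    return (x - 0.3) ** 2 + (y - 0.7) ** 2

def heuristic_search(trials=1000, seed=42):
    # Run many randomized trials and keep whichever parameters
    # produced the lowest loss -- then "freeze" that configuration.
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(trials):
        params = {"x": rng.random(), "y": rng.random()}
        loss = run_simulation(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss  # the frozen model configuration

frozen, loss = heuristic_search()
```

Note the cost dynamic the comment points at: if anything about the model or objective changes, the entire search has to be re-run from scratch, because nothing learned in the earlier trials carries over.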
Board Member, Advisor, Executive Coach in Software3 years ago
Despite all the marketing hype, the reality of AI in the global security market space is that it’s about 10% of the spending, regardless of what people say they're doing. In 2020, Forbes reported cybersecurity spending was $123 billion in total. But when you look at the global AI and cybersecurity marketplace in some studies that were done, the total spending around AI and the market size in late 2019 and early 2020 was about $9 billion.  

If you look at all the marketing, 90% of people say they're doing AI and ML. But when you look at the financial numbers, it's less than 10%. So there's a lot of hype there and it might be that they have some models in the early stages, or maybe they're not really doing AI and ML. They may just have an expert system that they're labeling as something else. That's where the context matters.
Director of Project Management23 days ago
Interesting to see the "a year ago" vs. "4 days ago" perspectives in the responses to your "When it comes to AI, is the industry evolving as fast as the technology?" question. Looking forward to the iterations here and in other "smart" dialogue. Kind regards, Peter Ross
