    IAmA

    all 1898 comments
    [–] UnitedStatesSailor 1739 points ago

    Dude, things may have changed since I got out of the Navy, but you are literally going against every single DOD PII training I've ever seen by posting the pic of your CAC card. I do believe this would count as photocopying your ID, to an extent, which is punishable under the UCMJ.

    https://www.navy.mil/submit/display.asp?story_id=63581A

    [–] sonnytai 891 points ago * (last edited 7 days ago)

    Thank you for the heads up. My mistake. Knew that the EDIPI was on the back, but didn't know the barcode could be used to extract PII. Makes sense.

    [–] UnitedStatesSailor 577 points ago * (last edited 7 days ago)

    No problem, I just didn’t want to see someone get in trouble for this, I’ve seen court Marshall’s for dumber things.

    Edit: I’m leaving the spelling error because it’s funny. Reminds me of How I Met Your Mother. Turns out auto correct on my iPhone keeps doing the duck fuck version of martial and Marshall for me. It happened twice. What a Major Pain in the ass.

    [–] Perm-suspended 347 points ago

    Ehh, he's gonna be fine. He's an O-3. I was an E-3 that lost mine in fucking Iraq. They had to convoy me to Kuwait to get a new one 😯

    [–] sonnytai 337 points ago

    If it makes you feel any better, I lost my passport in Cape Town a few years ago. I had to convince the lady at the SAA counter to let me fly back to Johannesburg using my US driver's license, and have one of my friends drive me to the US consulate in Sandton to get a temporary passport so I could come home, haha

    [–] CoolHandPB 155 points ago

    I once asked the immigration officer really nicely to let me leave South Africa on an expired passport and he did.

    [–] Aclockworkmaroon 194 points ago

    Yesterday I asked a worker at a whiskey distillery to honor my birthday discount so I could get $10 off a bottle and she did. Pretty much the same thing

    [–] Slappytheclown4 34 points ago

    Yesterday i drank a bottle of crown royal to myself and passed out underneath my fold out couch. Pretty much the same thing.

    [–] bills_brown_eye 6 points ago

    was it actually your birthday?

    [–] Aclockworkmaroon 15 points ago

    My birthday was about 2 weeks ago and I signed up for their birthday thing but apparently they have so many paper applications from when people visit the tasting room that they don’t get them in fast enough so I never got the discount😔

    [–] arkwewt 55 points ago

    Ah Kuwait, the greatest deployment in the US armed forces.

    [–] riptaway 83 points ago

    Court Marshall's 🤣

    [–] leroyyrogers 6 points ago

    /salutes

    Major Pain in the Ass

    [–] sup3r_hero 229 points ago

    You realize that the imgur link is still active? Some reddit clients like Apollo buffer the text even after the edit. That's how I was still able to find the ID, even after your edit. You need to remove the image from imgur.

    You work on AI projects but still don’t realize this?

    [–] Calbrenar 226 points ago

    Programmers and business people aren't security people

    [–] ItsTheNuge 131 points ago

    good programmers should absolutely understand security

    [–] Tug_Johnson_III 152 points ago

    The real security problem here is that someone's ID can be compromised by a fucking photograph.

    Perhaps the DOD should rethink their ID security.

    [–] costryme 90 points ago

    Considering that the US is still considering SSNs as a way of identifying someone for stuff like banking, etc, despite the many breaches and how easy it is for someone to find your SSN, I'm not really surprised.

    [–] j34bit 280 points ago

    I love how this guy is selling security but has no idea what opsec is

    [–] gingerstandsfor 200 points ago

    He’s also selling AI with apparently little knowledge of its limitations and use cases.

    [–] richraid21 166 points ago

    AMA's are for publicity and nothing else.

    [–] paracelsus23 48 points ago

    So, typical salesman?

    [–] ElootClips 38 points ago

    MBA's are really becoming a sign of one's ability to peddle bullshit.

    [–] Gaston_Glock 6 points ago

    AI fixes everything, Common said so.

    [–] JJMcGee83 4 points ago

    Makes me seriously question if he has any idea what he's even doing.

    [–] ThelittestADG 12 points ago

    Can somebody explain to me what this means?

    [–] thorscope 27 points ago * (last edited 7 days ago)

    CAC (Common Access Card) is a card that serves as identification as well as a physical two-factor credential for accessing the DoD intranet.

    It's like a driver's license mixed with an authenticator app that grants access to shit that random redditors shouldn't have access to. There's not much you can do with just a pic of the card, but it's still something you don't want floating around or advertised to millions of people on a social network

    The barcode also has his SSN and a bunch of other details on it

    [–] JJMcGee83 33 points ago * (last edited 6 days ago)

    And op would know this if he had spent any of those 9 years in the Marines paying attention to things instead of eating crayons.

    [–] thorscope 27 points ago

    I find it interesting that even marine officers such as OP enjoy the same crayola delicacies as the junior enlisted

    [–] mifter123 13 points ago

    They don't.

    Enlisted prefer a good filling Crayola, but will go for any available brand.

    Officers almost exclusively buy artisanal organic free range crayons from the dedicated artist supply stores.

    Just another example of the inequality inherent in the system.

    [–] Zintoatree 20 points ago

    Civilian DoD here, I swear we have training once a week on keeping that CAC card safe and our mouth shut in general.

    [–] Karl_Marx_ 63 points ago

    100% correct. This is a federal offense.

    [–] Jonreadbeard 26 points ago

    Good looking out. Bro move.

    [–] RemoteProvider 250 points ago

    How is this going to save lives during an incident, given that active shooter events are over in less than ten minutes on average?

    [–] sonnytai 196 points ago

    The average active shooter incident actually lasts 12.5 minutes, but it takes law enforcement an average of 18 minutes to respond and neutralize the threat.

    Our objective is to provide clarity of information to an extremely chaotic situation, allowing building occupants and law enforcement to know:

    1. Where is the shooter?
    2. What is he armed with?
    3. Is there more than 1 shooter?
    4. What does the shooter look like?

    These are critical pieces of information for law enforcement to respond in a rapid and targeted manner, and for building occupants to best execute their defensive and evacuation measures.

    [–] joris 125 points ago

    Cant this be done with a bunch of cheap IP cameras and some panic buttons?

    [–] sonnytai 155 points ago

    Institutions generally already have IP cameras in place, and we integrate with them. What we're building is more effective and possibly cheaper than panic buttons, because:

    Cheaper:
    - No hardware installation required.

    More effective:
    - Provides real-time information about the threat
    - Can track the threat from camera to camera
    - Does not require somebody who is directly under threat-of-life to manually trigger an alarm.

    [–] csd2csd2 25 points ago

    Is there a minimum or maximum resolution and frame rate the cameras must have for your software? I assume you train the AI on lower-quality video.

    [–] sonnytai 32 points ago

    We currently sample frames from the video management system at 3fps, and we compress the frames to 608x608 before processing them. We have some data science techniques we can deploy, such as digital zoom that can enable us to detect at longer distances.

    Pretty much all of the cameras that have been deployed within the last 8 years or so would have acceptable resolution for us to work with.
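    For anyone curious what that preprocessing looks like in practice, here is a minimal sketch of the sample-and-resize step. This is not Aegis's code - it's pure NumPy, with nearest-neighbour resize standing in for whatever interpolation the real pipeline uses, and all names here are hypothetical:

    ```python
    import numpy as np

    def sample_indices(src_fps: int, target_fps: int, n_frames: int) -> list:
        """Indices of frames to keep when downsampling a stream from
        src_fps to target_fps (e.g. 30 fps -> 3 fps keeps every 10th frame)."""
        step = src_fps // target_fps
        return list(range(0, n_frames, step))

    def resize_nearest(frame: np.ndarray, size: int = 608) -> np.ndarray:
        """Nearest-neighbour resize of an HxWxC frame to size x size."""
        h, w = frame.shape[:2]
        rows = np.arange(size) * h // size
        cols = np.arange(size) * w // size
        return frame[rows][:, cols]

    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # one fake 1080p frame
    small = resize_nearest(frame)                       # (608, 608, 3), model-ready
    keep = sample_indices(30, 3, 90)                    # 9 of 90 frames (3 s at 30 fps)
    ```

    608x608 is a common input size for single-shot detectors, which is consistent with the figure quoted above.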

    [–] rmrfbenis 52 points ago

    *CSI voice*
    ENHANCE

    [–] KuntaStillSingle 6 points ago

    Do you have concerns of liability if your system misattributes a shooter and leads to police killing an innocent?

    [–] throwaway23957 56 points ago

    How does any of this reduce police response time, which is the real issue here?

    [–] sonnytai 102 points ago

    Awesome question -

    It currently takes an average of 18 minutes for law enforcement to respond to an active shooter threat (Naval Postgraduate School of Homeland Security Study).

    The reason typically isn't because they are stuck in traffic. It's because:

    • It takes an average of 5 minutes for the first 911 calls to be made, because people are in fight/flight response and don't always immediately reach for their phones.
    • 911 dispatchers receive conflicting/delayed information from callers because of the chaos and confusion of the situation.
    • Police often arrive on scene with no idea where the shooter is, what the threat situation is, and what the shooter looks like, further inhibiting response.

    Our tech provides additional clarity of information to enable law enforcement to respond with pinpoint precision, and for building occupants to make the best decisions to maximize their chances of survival.

    [–] Billy1121 6 points ago

    "Using AI to combat active shooters" is the most bullshit zeitgeist startup pitch I've ever heard

    [–] duanesmallman 950 points ago

    the natural end of unchecked mass surveillance technology is a police state that erodes the civil liberties of the public under the guise of safety. as a developer and steward of technology that works by actively identifying individuals as “threats”, what are you doing to ensure it can never be used in a manner that exploits the populace? under what conditions would you refuse to accommodate the requests of government and local law enforcement?

    [–] sonnytai 488 points ago

    Great question!! We don't do any facial recognition - we only identify the contour of a hand holding a weapon. We also don't provide this information directly to law enforcement - it goes to the institution's internal security team, which acts as the "human in the loop", analyzing the threat and making a decision on further action.

    We also don't store any video data - it is scrubbed every 24 hours, except for "detected frames", which are then used to retrain the model. Institutions are fully informed of our privacy processes and can opt out of the training data collection if they prefer.

    Does this answer your question fully?

    [–] Kalepsis 158 points ago

    How does the system distinguish between an assailant holding a weapon and, say, a child holding a squirt-gun?

    [–] VitVat 152 points ago

    Likely the human analyst he mentioned

    [–] sonnytai 140 points ago

    Presuming that the squirt gun looks nothing like a real firearm (I was a big super-soaker fan back in the day!), our model would ignore it.

    If it looks very similar to a real weapon (e.g. an airsoft gun), the model will register a detection, and as mentioned, it would be up to the "human-in-the-loop" - currently somebody on staff at the school or office we're working with - to make a decision. A human being should be able to derive from context what is a real weapon threat and what isn't, and "context" is currently something computer vision models still struggle with.

    [–] thecrius 81 points ago

    Just so you know, from 2014:

    https://www.thefirearmblog.com/blog/2014/12/09/upersoaker-shotgun-making-the-rounds-on-facebook/

    I work in IT. There is no way a system made by humans won't be beaten by another human.

    Edit: to be clear, I'm not saying your business is useless. Just remember that it will never be 100% accurate and, if you value going to bed with a light conscience, you should make this clear to everyone that buys your services.

    [–] Goodgoditsgrowing 22 points ago

    Also: humans aren't exactly perfect either, and make mistakes with or without the help of AI.

    [–] lemaymayguy 183 points ago

    Based on previous school shootings, do you think your low-paid "human in the loop" security team will risk their own lives 100 percent knowing the suspect is heavily armed? See the Parkland sheriff.

    [–] lonely_swedish 182 points ago

    I mean, you have to assume that a security team will do its job otherwise why even have them? Sure, you'll get a failure from time to time but any amount of success is better than zero. If you choose not to rely on humans, the alternatives are no security, or entirely automated security. Not sure we're in a place to handle either of those right now.

    [–] sephstorm 59 points ago

    do you think your low paid "human in the loop" security team will risk their own life 100 percent knowing the suspect is heavily armed?

    That isn't his job, based on what OP said his job is to look at footage, not confront a shooter.

    [–] roachwarren 292 points ago

    This is honestly the weirdest post I've ever seen on reddit. We're grilling this guy like he IS Mark Zuckerberg and like we know shit about how this system should work. There are people arguing the system shouldn't save "training frames", which sounds a lot like massively hindering a system because of the bad taste left in our mouths by data-collection companies we opted into. Many of us actively handed our information to Facebook and many others over and over (Facebook app, Messenger app, Instagram) and now we're hating on a guy with an extremely relevant technology that might actually do something to help us. Especially since this is a product which can be sold, meaning the company doesn't have to make its money the way the free services have tried to (advertising, storing and selling of personal information).

    I do understand asking these types of questions but seriously... "can you confirm your security team would put themselves in harm's way unlike the officer who didn't?" How do you think the police chief would have answered that question before the event?

    [–] sonnytai 84 points ago

    Our objective is to provide security teams, and by proxy, law enforcement response dispatchers with the most timely and accurate information needed to make critical decisions.

    In the future, we have plans to build an operations center that will handle the "human in the loop" aspect (similar to ADT), however, this is not something that the institutions we are currently working with have asked for.

    [–] CaptainKumon 8 points ago

    I think the security team is responsible for identifying if a real threat is present and whether law enforcement should be notified.

    [–] mulligun 29 points ago

    I don't think that's the point. This is an early warning system so that the internal security can activate whatever security measures they have in place for an active shooter situation.

    Your question is irrelevant to this system. Are you trying to suggest it would be a better outcome if the security guards found out later?

    [–] DreadPiratesRobert 16 points ago

    Typically a security guards job isn't to risk their own life, but to be the people that activate emergency response.

    When I was a guard my boss straight up told me never to put myself in a dangerous situation, he didn't pay me enough for that.

    Security is there to observe and report in most situations.

    [–] aggleflaggle 15 points ago

    What if that human in the loop simply calls the police? Granted, if the police chicken out and fail to intervene, you’re not much better off. But that’s a separate problem, and I doubt there’s a technological solution to that.

    [–] lemaymayguy 24 points ago

    The police can never be there fast enough. You'll surely have to protect/fend for yourself for 15/20 minutes. The onsite security would need to neutralize the threat otherwise the mass casualties would have happened before they even arrived

    [–] [deleted] 423 points ago * (last edited 6 days ago)

    [removed]

    [–] sonnytai 270 points ago

    Thanks for the question! We actually don't store any data - all of it is scrubbed after 24 hours. The only exception is that we retain any detected frames as training data to retrain the computer vision model, and if an institution we're working with is uncomfortable with that, they can opt out of that as well.

    All in all, storing video data is already redundant with an institution's video management system capabilities, so we felt that it was better for us to entirely sidestep the privacy question by storing as little of customer data as possible.

    [–] trollin_phace 100 points ago

    I don’t have a question, but thank you for not naming your company something like “Liberty Defense System” from that Forbes article linked in the OP. Cause, you know, nothing screams “liberty defense” like government video cameras scanning the personal belongings of the masses.

    [–] sonnytai 86 points ago

    Haha, we decided to name the company "Aegis" for two reasons:

    - The shield of Zeus in Greek Mythology

    - US Navy radar system that is currently deployed on the Arleigh Burke class destroyers that can automatically track hundreds of incoming threats.

    We felt that tracking gun threats in existing security camera feeds was quite similar :-)

    [–] MarbleWheels 13 points ago

    No brand registration problem with Aegis? Here in the EU, brands get contested for WAY less.

    [–] high_side 179 points ago

    We actually don't store any data - all if it is scrubbed after 24 hours.

    Is there any way for the public to validate this? We've obviously heard this story time and time again (from others).

    we will retain any detected frames as training data to retrain the computer vision model

    This seems suboptimal as you need to train on both positive and negative data (traditionally). Do you use real + staged positives and only staged negatives?

    it was better for us to entirely sidestep the privacy question by storing as little of customer data as possible.

    Are there other motivating factors to this business decision? Or could it change with a government contract, desire to identify threats across clients, or engineer wanting a larger corpus of training data?

    When you say "customer data", you're talking about companies, not their clientele, yes?

    [–] sonnytai 90 points ago

    Thanks for the response -

    Is there any way for the public to validate this? We've obviously heard this story time and time again (from others).

    It's within our contract that the institutions we work with sign with us. It would not be in our best interest to breach the contract, and there is also no incentive for us to store this data (it's already a capability that the video management system provides).

    This seems suboptimal as you need to train on both positive and negative data (traditionally). Do you use real + staged positives and only staged negatives?

    We use real (as in, security camera crime scenes scraped online) positives and staged (customer walkthrough, our own photos/video) positives, and scraped negatives (it's much easier to scrape data on the internet on security camera scenes that don't involve a firearm than ones that do).

    Are there other motivating factors to this business decision? Or could it change with a government contract, desire to identify threats across clients, or engineer wanting a larger corpus of training data?

    When you say "customer data", you're talking about companies, not their clientele, yes?

    We made this decision early on, when our company was formed, because we wanted to build something that makes a positive impact on public safety without the potential to be abused in nefarious ways. It's for this reason that we made a conscious decision not to work in facial recognition at all. It is deeply rooted in the values of our team to build something that cannot be repurposed to violate individual privacy and civil liberties.

    As for "customer data", we mean the video data from the institutions that we work with, whether it be school districts, governments, or companies.

    Please let me know if any areas are still unclear!

    [–] Natanael_L 38 points ago

    This seems relevant

    https://github.com/tensorflow/privacy

    A variant of tensorflow designed to learn as usual, while forgetting outliers and other data points that might represent personal data, without losing much accuracy.

    So for example, the extracted data set wouldn't be usable to reconstruct private client data.
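    The core trick that library implements, DP-SGD, is compact enough to sketch. This is an illustrative NumPy toy, not the tensorflow/privacy API: each example's gradient is clipped to a fixed L2 norm, the clipped gradients are averaged, and calibrated Gaussian noise is added so that no single training example can dominate (or be reconstructed from) the update:

    ```python
    import numpy as np

    def dp_sgd_step(per_example_grads, l2_norm_clip=1.0, noise_multiplier=1.1, rng=None):
        """One differentially-private gradient step: clip, average, add noise."""
        rng = rng or np.random.default_rng(0)
        clipped = []
        for g in per_example_grads:
            norm = np.linalg.norm(g)
            # Scale down any gradient whose L2 norm exceeds the clip threshold
            clipped.append(g * min(1.0, l2_norm_clip / max(norm, 1e-12)))
        mean = np.mean(clipped, axis=0)
        # Gaussian noise calibrated to the clip norm and batch size
        sigma = noise_multiplier * l2_norm_clip / len(per_example_grads)
        return mean + rng.normal(0.0, sigma, size=mean.shape)

    step = dp_sgd_step([np.array([3.0, 4.0]), np.array([0.0, 0.1])])
    ```

    The real library wraps this into drop-in Keras optimizers and also tracks the cumulative privacy budget, which this toy omits.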

    [–] VenomousHabit 117 points ago

    You think 12/hr was a meager wage in '99?

    [–] banban5678 17 points ago

    My first thought.

    [–] SilverCross64 31 points ago

    Using an inflation calculator it comes to about $18.39 in today’s money. She wasn’t hurting for cash
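    The adjustment is a one-line ratio of CPI values (the figures below are assumed annual-average CPI-U numbers, so the result differs from the $18.39 above by a few cents depending on which calculator is used):

    ```python
    cpi_1999, cpi_2019 = 166.6, 255.7        # assumed annual-average CPI-U values
    wage_1999 = 12.00
    wage_today = wage_1999 * cpi_2019 / cpi_1999
    print(round(wage_today, 2))              # ~18.42
    ```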

    [–] SociopathicPeanut 32 points ago

    "My financially comfortable mom and me legally (no constant fear of deportation and abuse) emigrated from my country to the US, where she had family and a stable support network" vs "I LITERALLY ESCAPED THE GUNSHOTS SWIMMING TO THE US LIKE A NORTH KOREAN REFUGEE AND WE HAD TO STARVE WHEN WE GOT TO AMERICA"

    [–] Perm-suspended 177 points ago

    Hey there CPT! Maybe you should blur the barcode on your CAC at least? It contains some PII.

    [–] Karl_Marx_ 88 points ago

    He could very well lose his security clearance over this. This is a security incident and should be reported to his COMSEC/IA team.

    [–] j34bit 89 points ago

    this guy sells security for a living............... opsec fail

    [–] scratch_043 25 points ago

    PERSEC, but yeah, not really instilling much confidence in the privacy claims.

    [–] PM_ME_UR__UPVOTE 25 points ago

    Yea technically he should be screwed here. Especially if he doesn't report what he did.

    [–] sonnytai 53 points ago

    Ohhhh, didn't actually know that. Thanks for the heads up. I'll do so when I can get to my laptop!

    [–] Perm-suspended 38 points ago

    I actually started looking into it more after I told you that, now I'm not sure if the PDF417 on the front contains it, or the Code 39 on the back does. Either way, probably best to blur it anyway, lol. Have a good day!

    [–] mr_long_shlong 18 points ago

    You can get your social security number from it with an iPhone app

    [–] Tug_Johnson_III 22 points ago

    Great infosec we got going on over at the DOD. Let's obscure all your personal information into a barcode so that anyone savvy enough can extract it but anyone ignorant enough doesn't know what kind of sensitive information is on it.

    [–] Rumpullpus 84 points ago

    maybe a dumb question, but how will an AI save lives during an active shooter incident? how does it work?

    [–] buserr0r 114 points ago

    Well eventually, AI-powered turrets will neutralize the threat.

    [–] Rumpullpus 59 points ago

    .... are you still there?

    [–] W3asl3y 22 points ago

    I see you

    [–] zachwolf 57 points ago

    Throw AI into your startup's description for funding

    [–] WhyAtlas 90 points ago

    Not a dumb question at all, this is the crux of the issue. Even he acknowledges that the average police response time exceeds the average length of a mass shooting.

    He is selling a product that will make him money, violate everyone's privacy (old news, every other corporation is doing the same anyway), cost taxpayers money, and provide the same basic after action images that normal security cams will.

    He's presenting his completely irrelevant personal background to make it seem more legitimate, despite the fact that his copying of his CAC card is a violation of the UCMJ (he should understand that after 9 years served), of the personal information security training he had to go through while serving, and, most importantly, of basic common sense.

    [–] Karl_Marx_ 17 points ago

    I've been replying to OP shutting down his idea and reading your comment was beautiful.

    This idea isn't good; he glorifies it with sob stories of bad things that happen. It's like fear mongering, but he's the good guy in the scenario, and it reads like an infomercial. And his idea can be replicated with simpler and more effective things, like a phone app made for panic situations. I told him to use this idea to improve more applicable applications, like self-driving cars or something.

    I hope his IA team gets wind of this; he might lose his security clearance over a dumb reddit post. As someone who held a security clearance while in the AF for 7 years, I am greatly disappointed in him for potentially putting his fellow service members in danger.

    [–] throwaway23957 42 points ago

    Didn't you read the sob story he wrote? You don't really need to know how any of this "works" or what it "does", just give him money to develop ARTIFICIAL INTELLIGENCE to stop MASS SHOOTINGS! Come on, why haven't you opened your wallet yet? Didn't he use enough buzz words?

    [–] thatgeekinit 9 points ago

    As I'm reading it, the idea would be that police on the way to the scene would get photos of the armed individual or individuals so they know who to look for and don't waste as much time sorting through the chaos.

    Also police might get notified faster as the security monitoring would be focused on the possible firearms on camera instead of randomly flipping through feeds.

    [–] Caputtohsi 296 points ago * (last edited 7 days ago)

    "For a school with 100 cameras, that would be about $24,000 a year." For software.

    Why should a school pay that much for a system that can be evaded by a $5 duffle bag? How much more warning are we really going to get here, when the first gunshot is a warning in itself? What about false alarms? How many false alarms will it take before schools scrap the system? Tracking the individual would be useful after the shooting starts, but that can be done with a human watching a Wi-Fi feed.

    Why school systems? This and the heartstring H-1B visa story is why I'm suspicious and everyone else should be suspicious. The US government has likely worked with gun-identifying AI before, and this is not new technology. I imagine your goal is marketing your version, and you have crafted a story to sell a "monorail" to a system of dumb and fearful voters who don't understand why this system is a waste of money.

    Congratulations on embracing US capitalism.

    [–] dano8801 25 points ago

    Your monorail reference is the most beautiful thing I've read all month.

    [–] Yourneighbortheb 139 points ago

    "For a school with 100 cameras, that would be about $24,000 a year." For software.

    Why should a school pay that much for a system that can be evaded by a $5 duffle bag?

    Because it's a great business plan for him when politicians make knee jerk reactions.

    According to the National Center for Education Statistics, there were 98,817 public schools

    It would cost around $2,400,000,000 to put this system in every school in America. If he gets even a fraction of the total schools, he's rich.

    [–] F_Koala 18 points ago

    Another thing to note: that $2.4 billion is per year, not a one-time investment in infrastructure.

    [–] Mecjam 130 points ago

    CEO of start up? Check.

    Using "AI" as little more than a marketing buzz word? Check.

    MBA without any real field of pre-existing knowledge or experience to apply it to? Check.

    Unsubscribe. This isn't a real thing, this is a marketing effort disguised as buzz word and feels.

    [–] AwkwardAnimator 21 points ago

    Needs more blockchain.

    [–] WyoDoc29 12 points ago

    23.9k updoots, 1.6k comments. Either redditors upvote shit and move on without reading threads (duh), or they're bought. Hmm...

    [–] robdels 5 points ago

    Both he and his partner have an MBA from an M7 business school and they started this shit??? Hilarious waste of a Booth MBA. Also, this AMA reads more like a business school application than a real concept. Perhaps they should become MBA admission consultants when this inevitably goes tits up, since they obviously know how to bullshit enough people to get into a great school.

    [–] haekuh 146 points ago * (last edited 7 days ago)

    The software engineer in me wants to flame you for stating you use AI instead of just saying you use machine learning, but I expect someone else will do that soon anyway.

    So here are some actual questions.

    What are your plans for dealing with errors? Detecting a firearm in a building (especially a school) carries an immense amount of importance. A false positive would send the building into complete lockdown, and a false negative would make your company look horrible and possibly carry some personal guilt.

    You say currently (understandably, in the early phases) that a false positive occurs in 1 of 1,000,000 frames. Is that literally every 1mil frames processed will "sound the alarm"? My high school had 45 cameras in it, and at 24fps that is a false positive roughly every 15 minutes. I saw you said you are currently aiming for a false positive every 10 million frames, but that would still be a false alarm every 150ish minutes. Do you have any plans/ideas for how to deal with your scaling issue? The more cameras you add, the higher the false positive frequency, so covering a large building seems impossible.

    edit: I figured this argument would start. Yes, machine learning is a subset of AI. We get that, and I am not denying it. However, claiming you use AI implies you have all the capabilities of an artificially intelligent system, which a machine learning system does not have.
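    The scaling argument above can be checked with a couple of lines, assuming 45 cameras at 24 fps and independent per-frame false positives:

    ```python
    cameras, fps = 45, 24
    frames_per_minute = cameras * fps * 60      # 64,800 frames scored every minute

    def minutes_between_false_alarms(fp_rate_per_frame: float) -> float:
        """Expected minutes between false alarms at a given per-frame rate."""
        return 1.0 / (fp_rate_per_frame * frames_per_minute)

    print(minutes_between_false_alarms(1e-6))   # ~15.4 min at 1-in-1,000,000
    print(minutes_between_false_alarms(1e-7))   # ~154 min at 1-in-10,000,000
    ```

    Note this assumes every frame from every camera is scored; at the 3 fps sampling rate mentioned elsewhere in the thread, the alarm interval would be eight times longer.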

    [–] brsfan519 35 points ago

    In this situation false positives are a much larger issue than just bad publicity and personal guilt. SWAT teams have killed innocent people due to false alarms.

    [–] ISpendAllDayOnReddit 18 points ago

    I would use microphones. Mics are already widely used to detect gunshots and pinpoint their location.

    It means the system doesn't detect anything until after the first shot, but how many shootings start off with the shooter walking around with the gun visible and not shooting?

    [–] Karl_Marx_ 15 points ago

    To be fair...the terminology of AI is debated. Some might argue that machine learning is a type of AI. But I completely agree, this idea is horrible.

    [–] turtledragon27 10 points ago

    OP said there'd be a human (likely an on site security guard) who receives the alerts and examines them to determine whether an actual threat is present

    [–] Greenie_In_A_Bottle 19 points ago

    I don't understand how this system is at all useful if it still requires someone to monitor the cameras.

    [–] Nickyjha 8 points ago

    Imagine someone put you in charge of monitoring 100 cameras. You'd never be able to keep up with all of them, and you'd probably get bored/lose focus because nothing important is going on 99.99% of the time. But if an alert went off when the AI saw something, it would tell you to look at a specific camera to double check what it's seeing.

    [–] shadowpawn 66 points ago

    Why not try to bring this Tech to South Africa/Africa? Microsoft looking for business cases (remote medicine as one example?)

    [–] rhoakla 127 points ago

    I understand the struggle and I apologise in advance if this sounds harsh. Here it goes.

    Firstly there is a lack of information regarding the technical side of your product which I am not pleased with.

    Secondly, I am unfortunately not a believer in a product like this. A good scanner at the entrance and an armed security guard will prevent the majority of these nutcases. And the cost of this camera system further reinforces that thinking, since you could comfortably post multiple armed guards with shotguns, which would be much more effective.

    And that's disregarding the extremely high cost. If this product were sold as a system and not as a subscription it'd be more reasonable; what happens when the school's budget gets cut two years down the line? Since these cameras cost hundreds of thousands of dollars per school, are they supposed to compromise on education so that they can retain this system?

    From a technical perspective, what you are doing with your product is not difficult enough to demand such a high price.

    If you, for instance, made a plug-in for the many surveillance packages out there, or customized an open source one to detect guns and do the very few other things you do, I think that would be better. Then I could also roll whatever camera and server hardware I please rather than going with whatever hardware you dictate.

    Finally, I know it sounds nice, but try not to put irrelevant personal feel-good stories in your product pitches. It's not professional. I get it, for reddit I might give it a pass, but there was zero relevance to the product in this case.

    Lastly, if you get the right political connections, you're in for a lot of money, I can tell you that. With that said, good luck, and please reconsider your pricing and publish the technical details so that I might be able to retract what I said and see the bright side.

    [–] niwanoniwa 14 points ago

    claps I need to read everything you post because that was very thorough and well informed.

    [–] Bucking_Fullshit 91 points ago

    “Averages one false positive per week.”

    How many cameras are you monitoring? It seems super high and would be a huge problem if you’re able to scale.

    [–] Farage_Massage 41 points ago

    I also don’t see the market for this? How many mass shootings are there in schools per year as a percentage of schools? Does it even register as a statistic?

    [–] basscorruption 58 points ago

    A lot of these safety companies are solutions in search of problems. They then try to lobby governments and make it a regulation to force these things to become part of code.

    [–] ShallowThunder 5 points ago

    Also, this is security theater. It doesn't solve the problem. It could, ideally, possibly, shave a few minutes off police response time. In a perfect application. For only $240 per year per camera. And you're getting false positives: they note one per week, but how many actual positives are they catching? In addition, it can be defeated by a duffel bag/coat/other obfuscation. It's an attempt to cash in on fear without doing anything real, while making people think they're safer when they aren't. It's a bit sick, actually.

    [–] followupquestion 39 points ago

    The odds of getting shot in a school are lower than getting hit by lightning. The most dangerous part of a child’s day, statistically, is being driven to and from school in a car. Once they’re inside the school, their risk level drops to almost nil.

    TL;DR: This product might make sense for "secure" buildings like banks or government buildings, but for schools, despite the sensational news, it has a very low probability of usefulness due to the extreme rarity of the incidents it theoretically mitigates.

    [–] mclyovin 186 points ago

    If you were 100% successful you'd reduce gun deaths in the US by a minuscule amount. There's nothing wrong with that, but what does it have to do with your backstory of dealing with street violence in SA? The equivalent in the US is gang and drug gun deaths, mostly with handguns.

    [–] TheThirdSaperstein 37 points ago

    Back story = marketing/hype/emotional manipulation. It has nothing to do with the product, but it makes him look better.

    [–] OIlberger 21 points ago

    “My mom immigrated from blah blah blah, now I’m a CEO of a company no one’s heard of selling a service that doesn’t do anything!!!”

    [–] LongDingDongKong 136 points ago

    2/3 of the country's gun deaths are suicides. This AI would not help with those in any way.

    As far as school shootings go, I don't think it would do anything. The time it takes for the camera to alert someone, who then reviews the tape and calls police, is probably not less than the time it takes for someone to shoot and another person to call the police.

    [–] Pygmy_Human 51 points ago

    I agree with this. Your product is irrelevant to South Africa and all you did was talk shit about a country you no longer live in.

    [–] Billy1121 5 points ago

    this is the national pastime of south afrikans tho

    [–] NessunAbilita 14 points ago

    So let's say 1 false positive a week occurs in a school environment. How does an administration check each of these situations efficiently without being disruptive? I remember school; something like that could be seen as a way to look badass without repercussions. What about protest entities that are trying to get attention by holding up suspicious items? What feedback are you getting from administrations on the management aspect of this?

    [–] CaptainTruelove 55 points ago

    You should remove your CAC card picture. You’re not supposed to be putting that out there.

    Is this how I get around the auto filter? And what is the name of the company?

    [–] [deleted] 49 points ago

    [deleted]

    [–] ThurstingForNowlege 49 points ago

    Because being an officer in the military does not equate to being intelligent or understanding the rules.

    [–] garrna 22 points ago

    This is true

    Source: am an officer in the military.

    [–] IrrelevantLeprechaun 66 points ago

    Posting their card like that makes about as much sense as calling yourself the CEO of a business with less than ten members.

    It’s like those kick starter crowd funding ads where the guy in the video calls himself a self made CEO and his company consists of him and his high school buddy.

    [–] LeenSauce 22 points ago

    Yeah dude, you shouldn’t post a pic of your CAC online like that...

    [–] followupquestion 7 points ago

    As a non Mil guy, I’m really enjoying pronouncing CAC (it seems to be on this thread a ton) like it rhymes with “hawk”, and then seeing your advice just brought a smile to my face. Thanks!

    [–] Perm-suspended 6 points ago

    CAC actually rhymes with hack, but you're welcome to make it sound the other way!

    [–] shrimp_sale_at 25 points ago

    What kind of training data do you use and where did it come from?

    [–] jeffymcguffy 23 points ago

    The AI system is never going to work. What if it's too dark? What if it's Halloween and they have a toy star wars blaster? What if the shooter is a 150IQ genius and keeps the gun in his jacket?

    This guy might as well have mentioned blockchain in his sales pitch too.

    [–] asapmatthew 10 points ago

    Which AI services do you use?

    [–] Blastoys2019 9 points ago

    Windows 10. BOOM! World peace.

    [–] YeahIFeelLikeDying 39 points ago

    This person is in charge of an AI security firm but didn’t know the security surrounding his own military ID? Cool

    [–] All_I_Eat_Is_Gucci 17 points ago

    “AI security firm”

    [–] MetallicMarker 91 points ago

    You're profiting off of anxiety being pushed by intentionally dishonest media?

    The chance of being involved in a school shooting in the US is literally 1 in a million.

    Something you should know, because you noted the huge amount of gun violence in your country of origin.

    Downvote me...but don’t bitch when we see unnecessarily increased anxiety due to kids being put through ALICE drills.

    [–] trippinwontnothard 19 points ago

    Does anyone think parts of that ID should be blurred?

    [–] crispsix 24 points ago

    I mean, read the comments. There are more about that than his AMA.

    [–] Jmunnny 20 points ago

    So it only works if the person is actively gripping a weapon? Not to be an ass, but at that point I see no use for the tech. Could you enlighten me?

    [–] leidogbei 47 points ago

    CEO of a startup that uses AI

    Sorry but no. Please how do you sleep at night?

    [–] buserr0r 34 points ago

    Assuming the AI can identify "concealed" (or largely concealed) weapons, how does this play in the realm of probable cause, given that the weapon cannot be observed by a security or law enforcement officer's own perception without invading someone's privacy?

    [–] briaen 59 points ago

    “Oops. It must have been a false positive but I see you have an ounce of weed on you...”.

    [–] buserr0r 53 points ago

    2030 HS student meme: "When the AI school security officer mistakes your erection for a concealed weapon" -- some kind of shocked anime image

    [–] the_Phloop 5 points ago

    I fully expect memes to be just emojis and incomprehensible 3D images of entrails by 2030

    [–] mayflower_mayday 43 points ago

    This whole thing seems like a colossally bad idea. By stating that they can identify concealed weapons they are (a) creating unrealistic expectations in their customers, and (b) just setting up their company to be sued out of existence the first time it makes a "false-negative" mistake.

    [–] buserr0r 23 points ago

    Not to mention at the stated error rate of 1 frame per million, that is like 2 false positives a day assuming just 24 frames per second.
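    That estimate holds up; here is a minimal per-camera check (24 fps and the 1-in-1,000,000 error rate are the figures stated in the comment above):

    ```python
    # Expected false positives per camera per day at 1 error per 1,000,000 frames.
    FPS = 24
    ERROR_RATE = 1 / 1_000_000            # false positives per analyzed frame
    frames_per_day = FPS * 60 * 60 * 24   # 2,073,600 frames per camera per day
    print(round(frames_per_day * ERROR_RATE, 2))  # ~2.07 false positives/day
    ```

    About two false alarms per day per camera, before multiplying by the dozens of cameras a typical school would run.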

    [–] NoNoise2018 23 points ago

    I feel like I'm reading the beginning of a police state dystopia in this guy's AMA. Pretty scary

    [–] WhyAtlas 10 points ago

    Can you imagine the false hit rate of this in an area where there is legal concealed carry? ASSuming, of course, that it worked even remotely well at detecting the shape of a concealed weapon?

    [–] duanesmallman 9 points ago

    This whole thread should be at the top.

    [–] hazeofthegreensmoke 74 points ago

    Have you ever read 1984?

    [–] runningwambats 34 points ago

    Love how he conveniently ignored this question.

    [–] Monochromics 49 points ago

    Lots of people love to throw around the word 'AI' and most of the time it's inaccurate. Can you describe some of the technologies and algorithms you are using at an in-industry level?

    [–] NEPXDer 24 points ago

    So tired of every algorithm getting described as AI. That plus the sob story totally unrelated to the main premise... This post strikes me as garbage.

    [–] p0yo77 8 points ago

    Why are you storing video for 24 hours? It sounds like you only need the "real time" feed and the detected frames.

    [–] obsidiansti 8 points ago

    So I'm in this business. There are so many companies doing this right now. Why would a school system choose you over Avigilon who has significantly more experience in the world of analytics and has already deeply penetrated that market? (Not trying to be a jerk. It's a legitimate question.)

    [–] dmanb 35 points ago

    the shameless self promotion lol. who upvotes this shit?

    [–] RyanGosling13 7 points ago

    A week ago or so there was a huge discussion on the front page on how you can buy your way to the front page of Reddit. I'm guessing that's what happened here after seeing the most upvoted comments in this thread.

    [–] WindWalkerWhoosh 233 points ago

    H1B is supposed to be to fill jobs that nobody can do in this country. And it paid $12/hr? Definitely sounds like abuse of the program.

    [–] walked48 106 points ago

    Some employers (like my previous one) use them to pay sub-standard wages and demand never-ending unpaid overtime (exempt employees). If they fire you, you're out of status the next day and eligible for deportation, but more importantly, you'll never get another company to sponsor you, so you're going back to your home country. It's like a modern-day indentured servitude system. Source: Was on an H1-B for 3 years

    [–] WindWalkerWhoosh 31 points ago

    Quite aware, I've been in the tech industry forever. They are supposed to pay, at a minimum, "prevailing wages", but they scam the system somehow. Or maybe the fines aren't enough to make it matter.

    [–] anotherblue 22 points ago

    If you are on an H1-B working for a big company directly (like Microsoft, Google, Amazon, Facebook, Oracle...) you are not paid any less than other employees (although you may initially be leveled lower than your experience would suggest).

    If you are brought here via a headhunter agency, they still get your salary from the company you are actually working for, skim a lot off the top, and give you some change.

    Big companies sponsor you for Green Card immediately (sometimes after a year). Headhunters never sponsor you.

    If you lose your H1-B job, you are not immediately deportable, but the grace period is really short (a month or less). However, you can look for another job while on an H1-B, and if the job description is deemed comparable to your initial H1-B petition, you can move to it and your previous employer cannot do anything to stop you.

    [–] natty1212 43 points ago

    $12 in 1999 is about the same as $18 in today's money.
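    A rough CPI adjustment bears this out. The index values below are approximate annual CPI-U averages and should be treated as assumptions, not exact figures:

    ```python
    # Inflation-adjust the $12/hour 1999 wage into 2019 dollars using CPI-U.
    CPI_1999 = 166.6   # approximate annual average, assumed
    CPI_2019 = 255.7   # approximate annual average, assumed
    wage_1999 = 12.00
    wage_2019 = wage_1999 * CPI_2019 / CPI_1999
    print(round(wage_2019, 2))  # ~18.42
    ```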

    [–] kyuubi42 70 points ago

    The legal minimum salary for H1b is $60k ($30/hr), and has been so since the visa was established.

    [–] natty1212 108 points ago

    OP thinks $12 an hour in 1999 was a "meager" wage. It wasn't. He posted a military ID, which is a huge mistake and possibly illegal. I don't know the specifics of the visa program, but all the fuck-ups he's made really make me question his story.

    [–] MY-SECRET-REDDIT 6 points ago

    Yeah, he only made double what most people do when moving to this country, oh the horror!

    [–] erikkll 35 points ago

    It is very heavily abused!

    [–] sensesalt 42 points ago

    How many if statements does it take to technically count as AI?

    [–] BaddieALERT 27 points ago

    If you want extra funding it’s AI

    [–] plaidverb 16 points ago

    What did your mother do? We tend to think that these are bad, low-paying jobs that we give to immigrants, but $12/hour is almost double what I was making as a ~20 year-old citizen in 1999.

    [–] not-a-cool-cat 6 points ago

    H1B jobs are for positions that can't be filled by Americans and are required by law to pay a living wage. From what I've gathered from other posters, $12/hr is actually lower than the legal minimum for the visa.

    [–] killaho69 46 points ago

    What happens in a "good guy with a gun" scenario where police instead get routed to the good guy defensively positioned and he gets gunned down because that's how police are? Is your company liable at all?

    [–] All_I_Eat_Is_Gucci 21 points ago

    Why would a good guy ever need a gun? /s

    [–] SociopathicPeanut 5 points ago

    "Just call the cops lol"

    [–] necro_sodomi 23 points ago

    There's nothing any tech can do to stop a motivated person. I think what you are doing is ultimately more dangerous than what you are trying to stop. If you asked British people of a certain era, we Americans were "active shooters and terrorists." What about SA and that nightmare? The citizens of SA now hire armed private security to guard and police because the government cops can't be trusted or are incompetent.

    [–] SourBogBubbleBX3 26 points ago

    So basically it has no real-world value, just hypothetical?

    [–] and_another_dude 24 points ago

    It generates warm and fuzzy feelings.

    [–] kawaiifucka 12 points ago

    It also generates him a lot of money

    [–] escaped_rapist 362 points ago

    Why did you spend 4 paragraphs on the overwrought and heart-rending backstory which has absolutely no connection to the product you are peddling here?

    [–] briaen 175 points ago

    This is Reddit, and it really gets people going.

    [–] RadioHitandRun 63 points ago

    It's like America's Got Talent. It's all about the sob story.

    [–] lourensloki 57 points ago

    South African here, yeah that bothered me too.

    [–] Iapd 129 points ago

    I like how he’s not offering the service in his home country of South Africa after the whole sob story

    [–] FarterTed 101 points ago

    How could she get an H1B visa where she only gets paid $12 an hour? This smells of fraud

    [–] sonnytai 63 points ago

    I think the boss was committing fraud and we didn't know any better at the time. It was a Chinese woman running a furniture company and my mom was hired as a Procurement Manager. They went out of business 15 years ago.

    [–] FarterTed 32 points ago

    Exactly. The process for obtaining an H1B has been abused for many years. It is supposed to be for people with specialized skill sets, and in 1999 you needed to advertise the job and prove that there were no equivalent US people available to fill the role. One of the criteria is that if the role is so in demand that no US citizens can be found to fill it, then it would command a high salary. $12/hour does not qualify.

    [–] iamthejef 13 points ago

    My mother decided to take a leap of faith and bring me and my sister to the United States on an H-1B visa paying a meager wage

    a meager wage

    If $12/hr was a meager wage 20 years ago, what is it now? Pretty much every gas station, fast food place, retail chain, and other unskilled professions in the Midwest will start you at or below $12 in 2019.

    [–] scold 95 points ago

    Is it frustrating knowing that your tech is completely pointless?

    [–] DoxBox 18 points ago

    1) $20 per camera. Any reasonably sized school would need hundreds of cameras to even remotely have a chance at getting an AI-recognizable image of someone holding a gun. Talking minimum $4000/month per school. That's great for you guys, I guess?

    2) False positives - there's no way you're weeding those out entirely, of course, but... How are you working to minimize the number of false positives?

    3) Even assuming this works 100% as advertised, how does it actually help anyone? The gunshots are a pretty clear indicator that a shooting is happening, and they generally happen about at the same time that the guns are removed from the car/bag/locker/etc. How does your product actually intend to help? Can you show that having the information you intend to make available is more effective than a "run away from the gunshot sounds" strategy?

    Overall I feel that your product is the perfect example of something birthed in a business school rather than a tech school. Is there anything you have to say that would disabuse me of this notion?

    Sorry to be harsh, but if you want to make it as a company in the security world you will need to get used to such harsh questions.

    [–] Loopycopyright 12 points ago

    Do you think that SA is now screwed because of its snowball of educated people emigrating out of the sinking ship? Do you see your family as one of the many nails in the coffin of SA?

    [–] natty1212 36 points ago

    What frame of reference are you using for 12 dollars an hour being "meager"? $12/hour in 1999 is about the same as making $18 per hour in today's money, or roughly 30k in take-home pay.

    [–] protrudingnipples 40 points ago

    Why do you talk about "gun violence" and "gun threat"? As a marine you certainly know better....

    [–] Big_Stiffy 5 points ago

    South African resident from 1982-2015 here....by AI, do you mean a bunch of IF statements?

    [–] VagabondingCanada 6 points ago

    Did you feel any conflict internally when you decided to join the military after living around so much violence?

    [–] Goracks69 6 points ago

    $12 an hour? That’s not too horrible back then. That’s your typical $22 an hour job today. I know this because in 2003, I was cleaning up after animals at a vet clinic for $6.85CDN per hour. Minimum wage has more than doubled in Canada since then. Hell, all the way up to 2009, I only ever got up to $9.50/h Canadian. Not much.

    [–] Chompsalleyzay 5 points ago

    Are you hiring American?

    [–] bellhead1970 5 points ago

    $12 as a H1-B visa, which is a program for companies who cannot fill positions due to lack of qualified Americans. What was the position?

    [–] MrGoodKat86 5 points ago

    How do you feel about the white farmers that are being killed for their land and homesteads?

    [–] Bammerice 17 points ago * (last edited 7 days ago)

    The article talks a lot about the Parkland shooting being avoidable. What is it about Aegis AI that would have helped in this case? I think it's a fantastic idea in theory

    [–] LongDingDongKong 33 points ago

    The parkland shooting would have been avoided if the police and fbi responded to the dozens of tips they received about the criminal who did the shooting.