nielsk / Niels Kobschätzki

There are three people in nielsk’s collective.

Huffduffed (212)

  1. Das Internet und seine Auswirkungen auf die Demokratie

    Yesterday I (Linus Neumann) gave a talk at the Schwarzkopf-Stiftung Junges Europa about the great opportunities and the current problems the Internet poses for democracy: Verheißung oder Bedrohung? Das Internet und seine Auswirkungen auf die Demokratie (Promise or Threat? The Internet and Its Effects on Democracy) [YouTube link].

    My personal highlight was the discussion that followed. The many young people in the audience kept it lively with smart additions, questions, and objections, and the conversations after the event were also on a high level. That gives me hope.

    —Huffduffed by nielsk

  2. Email Infrastructure with Chris McFadden

    A company like Pinterest has millions of transactional emails to send to people. The scalability challenges of sending high volumes of email mean that it makes more sense for most companies to use an email as a service product rather than building their own.

    Chris McFadden is the VP of engineering and cloud operations at SparkPost and he joins the show to explain the architecture of SparkPost’s email as a service product. SparkPost started as an on-premise email technology for large enterprises, and evolved into a SaaS product. In 2014, the company migrated to the cloud, which has changed its infrastructure as well as its operational model.
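
    For a sense of what "email as a service" means in practice, here is a minimal sketch showing a single transactional send as one authenticated HTTP call instead of running your own mail servers. The episode itself contains no code; the endpoint, header, and payload shape below follow SparkPost's publicly documented transmissions API, and the API key variable, addresses, and message text are made-up placeholders.

        # Minimal sketch (not from the episode): sending one transactional email
        # through an email-as-a-service REST API, modeled on SparkPost's documented
        # POST /api/v1/transmissions endpoint.
        import os

        import requests

        API_KEY = os.environ["SPARKPOST_API_KEY"]  # placeholder environment variable

        payload = {
            "content": {
                "from": "noreply@example.com",
                "subject": "Your weekly digest",
                "text": "Hello! Here is what happened this week...",
            },
            "recipients": [{"address": "user@example.com"}],
        }

        resp = requests.post(
            "https://api.sparkpost.com/api/v1/transmissions",
            headers={"Authorization": API_KEY, "Content-Type": "application/json"},
            json=payload,
            timeout=10,
        )
        resp.raise_for_status()
        # The provider handles queuing, scaling, retries, and deliverability;
        # the response carries a transmission id and accepted/rejected counts.
        print(resp.json())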

    —Huffduffed by nielsk

  3. Now in RPG Form Radio-Powered 80’s Cyborgs!- Nerdarchy Live Chat #164

    Nerdarchy the Newsletter: http://nerdarchynewsletter.gr8.com/
    SIGMATA: This Signal Kills Fascists on Kickstarter: https://www.kickstarter.com/projects/2089483951/sigmata-this-signal-kills-fascists
    On Facebook: https://www.facebook.com/RepeatTheSignal/

    "SIGMATA: This Signal Kills Fascists" is a cyberpunk tabletop role-playing game about ethical insurgency against a fascist regime, set in a dystopian vision of 1980s America.

    Players assume the role of Receivers, the superheroic vanguard of the Resistance, who possess incredible powers when in range of FM radio towers emitting a mysterious number sequence called "The Signal." When the Signal is up, Receivers lead the charge against battalions of Regime infantry and armor or serve as the People's Shield, protecting mass demonstrations from the brutality of a militarized police force and neo-Nazi hooligans. When the Signal is down, however, Receivers are mere mortals, desperately fleeing from a powerful state that senses their weakness.

    It's called the Sigmata, a Signal-induced stigmata, because it is both a blessing and a curse. At least when you're marked by the state, you can’t sit on the sidelines anymore. Please Like, Comment, Share and Subscribe…

    ===
    Original video: https://www.youtube.com/watch?time_continue=3&v=rNLrEFFWnNw
    Downloaded by http://huffduff-video.snarfed.org/ on Sun, 03 Dec 2017 06:12:19 GMT

    Tagged with gaming

    —Huffduffed by nielsk

  4. Pattern Recognition: Review of The Sprawl, The Veil, and Headspace

    Mark talks about three different cyberpunk RPGs, all Powered by the Apocalypse.

    The Sprawl http://www.ardens.org/games/the-sprawl/

    The Veil http://samjoko.storenvy.com/products/18644524-the-veil-pdf

    Headspace http://www.greenhatdesigns.com/?project=head-space

    ===
    Original video: https://www.youtube.com/watch?v=U0cM4wt2XHk
    Downloaded by http://huffduff-video.snarfed.org/ on Thu, 05 Oct 2017 10:04:13 GMT

    Tagged with gaming

    —Huffduffed by nielsk

  5. Steven Shorrock on the myth of human error

    The O’Reilly Security Podcast: Human error is not a root cause, studying success along with failure, and how humans make systems more resilient.

    In this episode, I talk with Steven Shorrock, a human factors and safety science specialist. We discuss the dangers of blaming human error, studying success along with failure, and how humans are critical to making our systems resilient. Here are some highlights:

    Humans are part of complex sociotechnical systems

    For several decades now, human error has been blamed as the primary cause of somewhere between 70% and 90% of aircraft accidents. But those statistics don’t really explain anything at all, and they don’t even make sense, because all systems are composed of a number of different components. Some of those components are human—people in various positions and roles. Other components are technical—airplanes and computer systems, and so on. Some are procedural, or are soft aspects like the organizational structure. In a complex sociotechnical system, we can never isolate one of those components as the cause of an accident, and doing so doesn't help us prevent accidents, either.

    There is no such thing as a root cause

    We have a long history of using human error as an explanation, partly because the way U.S. accident investigations and statistics are presented at the federal level highlights a primary cause. That is a little naïve (primary and secondary causes don’t really exist; the line between them is arbitrary), but if investigators have to choose something, they tend to choose a cause that is closest in time and space to the accident. That is usually a person who operates some kind of control or performs some kind of action, and who stands at the end of a complex web of actions and decisions that goes way back to the design of the aircraft, the design of the operating procedures, the pressure imposed on the operators, the regulations, and so on. All of those are complicated and interrelated, so it's very hard to single one out as a primary cause. In fact, we should reject the very notion of a primary cause, let alone pin the blame on human error.

    Studying successes along with failures

    If you only look at accidents or adverse events, you're assuming that those very rare unwanted events are somehow representative of the system as a whole. In fact, an accident is a concatenation of causes that come together to produce a big outcome. There's no big cause; it's a fairly random bunch of things that happened at the same time and were always there in the system. We should not just be studying when things go wrong, but also how things go well. If we accept that the causes of failure are inherent in the system, then we can find them in everyday work and will discover that very often they are also the causes of success. So we can't simply eliminate them; we have to look deeper.

    Humans make our systems resilient

    Richard Cook, of Ohio State University's SNAFU Catchers project, says that complex sociotechnical systems are constantly in a degraded mode of operation. That means that something in the system (and usually a lot of things) is not working as it was designed. It may be that staffing numbers or competency aren’t at the level they should be, or refresher training has been cut, or the equipment isn't working right. We don't notice that our systems are constantly degraded because people stretch to connect the disparate parts of the system that don't work right. You know that, in your system, this program doesn't work properly and you have to keep a special eye on it; or you know that this other system falls down now and then, and you know when it's likely to fall down, so you watch out for that. You know where the traps are in the system, and, as a human being, you provide that resilience: you want to stop problems from happening in the first place. The source of resilience is primarily human; it's people who make the system work.

    People can see the purpose in a system, whereas procedures can only describe a prescribed activity. In the end, we have a big gap between at least two types of work—work as imagined (what we think people do) and work as done (what people actually do)—and in that gap lies all sorts of risk. We need to look at how work is actually done, and be mindful of how far that has drifted from how we think it's done.

    Related resources:

    Behind Human Error

    Nine Steps to Move Forward From Error

    ‘Human Error’: The handicap of human factors, safety and justice

    Human error (position paper for NATO conference on human error)

    https://www.oreilly.com/ideas/steven-shorrock-on-the-myth-of-human-error

    —Huffduffed by nielsk

  6. WR633 Das Wahlsystem | Wer redet ist nicht tot

    Thomas Brandt is a social studies teacher and is giving me lessons in politics. In the 19th lesson I get to know our electoral system.

    Detailed show notes and Flattr buttons can be found in Thomas’ blog.

    Podcast: Download (Duration: 32:52 — 15.1MB)

    http://www.wrint.de/2016/12/07/wr633-das-wahlsystem/

    —Huffduffed by nielsk

  7. Super-Botnetz “Avalanche” - “Hier wurde das Online-Banking sicherer gemacht”

    With their phishing and spam attacks, the operators behind the "Avalanche" network caused damage worth millions worldwide. German investigators have now dismantled its infrastructure. Computer experts praise the takedown: it has made online banking safer, said Andreas Bogk of the Chaos Computer Club on DLF.

    http://www.deutschlandfunk.de/super-botnetz-avalanche-hier-wurde-das-online-banking.694.de.html?dram:article_id=372899

    —Huffduffed by nielsk

Page 1 of 22