Features and Capabilities

From PlexodusWiki
Revision as of 12:55, 25 October 2018 by Dredmorbius (Talk | contribs)


A list of features and capabilities of social media platforms.


Account management[edit]

Capabilities associated with a user's own account lifecycle. Generally:

  • Creation.
  • Deletion.
  • Disabling.
  • Editing.
  • Migrating.
  • Authenticating.
  • Audience/recipients (public, private/limited, public groups, private groups, ...).
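The lifecycle capabilities above can be sketched as a small state machine. This is an illustrative Python sketch only; the state names and allowed transitions are assumptions, not any platform's actual rules.

```python
from enum import Enum, auto

class AccountState(Enum):
    """Hypothetical account lifecycle states; names are illustrative."""
    CREATED = auto()
    ACTIVE = auto()
    DISABLED = auto()
    DELETED = auto()

# Allowed transitions (an assumption for illustration, not a standard).
TRANSITIONS = {
    AccountState.CREATED: {AccountState.ACTIVE, AccountState.DELETED},
    AccountState.ACTIVE: {AccountState.DISABLED, AccountState.DELETED},
    AccountState.DISABLED: {AccountState.ACTIVE, AccountState.DELETED},
    AccountState.DELETED: set(),  # deletion is terminal here
}

def transition(current: AccountState, target: AccountState) -> AccountState:
    """Return the new state, or raise if the transition is not allowed."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot go from {current.name} to {target.name}")
    return target
```

Modelling the lifecycle explicitly makes it easy to see which capabilities (disabling, re-enabling, deletion) a platform actually offers.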


Content management[edit]

Capabilities associated with content created or modified by a user. Generally:

  • CRUD: Create, read, update, delete.
  • Encryption and authentication.
  • Archival.
  • Ownership and liability. Especially: copyright, unauthorised use, prohibited content.
  • Automation.
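The CRUD operations above can be made concrete with a minimal in-memory store. This is a sketch under assumed semantics (integer IDs, string bodies), not a real platform API.

```python
import itertools

class ContentStore:
    """Minimal in-memory CRUD store; illustrative only."""

    def __init__(self):
        self._items = {}
        self._ids = itertools.count(1)  # simple monotonically increasing IDs

    def create(self, body: str) -> int:
        """Create an item and return its new ID."""
        item_id = next(self._ids)
        self._items[item_id] = body
        return item_id

    def read(self, item_id: int) -> str:
        """Return the item body; raises KeyError if absent."""
        return self._items[item_id]

    def update(self, item_id: int, body: str) -> None:
        """Replace an existing item's body."""
        if item_id not in self._items:
            raise KeyError(item_id)
        self._items[item_id] = body

    def delete(self, item_id: int) -> None:
        """Remove the item; raises KeyError if absent."""
        del self._items[item_id]
```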

Relationship Management: Contacts, users, profiles, and groups[edit]

This is about creating, destroying, defining, and communicating relationships between individuals and/or groups.

  • Creation, review, modification, communication, disabling, and deletion.
  • Defining associated capabilities or restrictions.
  • Temporary status or capability changes.

Group management[edit]

Covers group definition and activities, including moderation.


Notifications[edit]

Notifications are associated with events. Something happens, and a message communicating it is transmitted.
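This event-to-message pattern is essentially publish/subscribe. A toy Python sketch (the event-type strings and handler signature are assumptions for illustration):

```python
from collections import defaultdict
from typing import Callable

class Notifier:
    """Toy publish/subscribe notifier: an event is published, and a
    message is delivered to every handler subscribed to that event type."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[str], None]) -> None:
        """Register a handler for one event type."""
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, message: str) -> None:
        """Deliver the message to all handlers for this event type."""
        for handler in self._subscribers[event_type]:
            handler(message)
```

Usage: subscribing an inbox's `append` method to an event type causes every published message of that type to land in the inbox.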

Presentation: Stream, search, filtering, random access, and discovery[edit]

How information is presented:

  • Stream: a fixed (usually chronological) sequence of messages is presented.
  • Search: messages matching given, generally ephemeral, criteria are presented.
  • Filter: messages matching generally persistent criteria are presented.
  • Random access: an arbitrary selection of content is presented.
  • Discovery: Tools for finding content, people, groups, events, or other items of interest.
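The stream/search/filter distinction above can be illustrated in a few lines. A sketch assuming messages are dicts with `time` and `text` keys (an assumption, not a real data model):

```python
def stream(messages):
    """Stream: a fixed chronological sequence (oldest first here)."""
    return sorted(messages, key=lambda m: m["time"])

def search(messages, query):
    """Search: a one-off (ephemeral) criterion applied to the corpus."""
    return [m for m in messages if query in m["text"]]

def make_filter(predicate):
    """Filter: a persistent criterion, built once and reapplied to
    whatever messages arrive later."""
    def apply(messages):
        return [m for m in messages if predicate(m)]
    return apply
```

The difference is mostly one of lifetime: a search criterion is supplied per query, while a filter is defined once and keeps applying.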


Events[edit]

Items associated strongly with time, and often with place or other attributes.


Places[edit]

Items associated strongly with place, and often with time or other attributes.

Privacy, advertising and surveillance[edit]

Participants need a clear sense of who will be able to see the content or information they share, and of who will know what they have interacted with. They also need to be able to trust that no one outside those circles can see or learn these things.

Bad actors need to be prevented from exploiting any system to gain such access.

Legal requirements may nonetheless require that such information be given to law enforcement agencies.


Federation and Syndication[edit]

Administrator and moderator capabilities[edit]

This may want to be a separate page, TMI. Dredmorbius (talk) 05:30, 13 October 2018 (CEST)


Several different things need reporting:

  • Content
    • harmful
    • illegal
    • legal, but not for an unvetted audience (NSFW, not safe for children, illegal in certain parts of the world)
  • Participants
    • bots that pose as humans
    • spam
    • sock puppets
    • people who wilfully and continuously violate the common agreements (spam again, but also harassment, etc.)


Reporting alone is not enough. Someone has to act on each report, deal with the problem, and then give feedback to the reporter and the reported-on party. This can sometimes be automated, but often enough it needs human judgement.

Additionally, some sort of appeals system is often needed.
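The report-handling workflow above (report, act, give feedback, appeal) can be sketched as a small state machine. The states, fields, and method names here are illustrative assumptions, not any platform's actual moderation model.

```python
from enum import Enum, auto

class ReportStatus(Enum):
    """Hypothetical states a report passes through."""
    OPEN = auto()
    RESOLVED = auto()
    APPEALED = auto()

class Report:
    """A user report about content or another participant."""

    def __init__(self, reporter: str, target: str, reason: str):
        self.reporter = reporter
        self.target = target
        self.reason = reason
        self.status = ReportStatus.OPEN
        self.action = None
        self.feedback = None

    def resolve(self, action: str, feedback: str) -> None:
        """A moderator (or automated rule) acts on the report and
        records feedback for the reporter and the reported-on party."""
        self.action = action
        self.feedback = feedback
        self.status = ReportStatus.RESOLVED

    def appeal(self) -> None:
        """The reported-on party contests the decision."""
        if self.status is not ReportStatus.RESOLVED:
            raise ValueError("only resolved reports can be appealed")
        self.status = ReportStatus.APPEALED
```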

System data management[edit]

Administrator-level data management capabilities, including integrity, availability, backup, restore, replication, and legally-mandated access.

General management[edit]

Users, authentication, content, data, relationships, notifications, data authority (copyright, liability, prohibited), internal and external attacks.